
Hardware acceleration and x265 encodes


rudrax


You need a software decoder like LAV Filters (included in the K-Lite Codec Pack) to do the software decoding.

See if it works, or use PotPlayer as your video player.



2 hours ago, teodz1984 said:

You need a software decoder like LAV Filters (included in the K-Lite Codec Pack) to do the software decoding.

See if it works, or use PotPlayer as your video player.

LAV Filters are already included in MPC-HC. PotPlayer also uses 25% CPU. Software decoding will eat more CPU no matter what you do. I am curious whether CoreAVC or DivX codecs can do anything about it.



CoreAVC and DivX are for 8-bit H.264 files; LAV does a better job at that now.

CoreAVC supports all H.264 profiles except 4:2:2 and 4:4:4.



37 minutes ago, teodz1984 said:

i-cores are more suited to these encodes...

Yeah, Intel i-series CPUs are great. When I was buying this laptop, I just needed it to watch movies and surf the web, not for any kind of games. Now look, you need a good CPU just to play some videos efficiently. Good Lord!



While Pentiums are good budget options, if you are into multimedia consumption, i-series CPUs are recommended.



4 hours ago, rudrax said:

LAV Filters are already included in MPC-HC. PotPlayer also uses 25% CPU.

Some people had worse problems than this using the Chrome web browser.

https://productforums.google.com/forum/#!topic/chrome/GKRp_Ucb_T4

Back before I switched to uBlock Origin, I used ABP, and Firefox was very heavy. I used to complain about it back when I only watched H.264; that browser took up more CPU than anything else I used. If you think 25% is bad, you should try encoding, where you use around 90% CPU, lol. I have encoded a lot of H.264 videos on Intel and AMD at high CPU load. A properly set up system (i.e., with decent cooling) won't be bothered by running at 100 percent for extended periods. :P

The only way to see if your PC is getting too hot is to watch a whole movie and use something like Core Temp to monitor it.

http://www.alcpu.com/CoreTemp/

You should worry if it goes above 75°C. I watch 720p 10-bit H.265 videos and mine stays below 50°C, and that's with Firefox, PotPlayer, and some other programs running. Even playing full 1080p (1920x1080) 10-bit H.265, mine stays below 60°C with Firefox and other programs running in the background. I tested this on my Gateway AMD machine; I will fire up my Dell mini tower with Intel and test later.
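The "watch a whole movie and check the temperature" advice boils down to a simple threshold check. Here is a minimal Python sketch of that logic only; the sample readings are hypothetical, and on Linux something like `psutil.sensors_temperatures()` could supply real ones (Core Temp itself is a Windows GUI tool):

```python
def overheating_cores(readings_c, limit_c=75.0):
    """Return the labels of cores whose temperature exceeds the limit.

    `readings_c` maps a core label to its temperature in Celsius;
    the 75 °C limit is the rule of thumb from the post above.
    """
    return [core for core, temp in readings_c.items() if temp > limit_c]

# Hypothetical readings while playing a 10-bit HEVC file:
sample = {"Core 0": 48.0, "Core 1": 51.5, "Core 2": 79.0, "Core 3": 50.0}
print(overheating_cores(sample))  # ['Core 2']
```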



  • Administrator

As someone who is not an avid movie watcher, this 8-bit vs 10-bit video thing does confuse me. I doubt I have ever seen a 10-bit video myself, so I wonder if they are common for 1080p downloads.

Either way, one thing I must mention: the tool posted in this thread thinks that my graphics card supports 8-bit HEVC, but the truth is my graphics card only supports HEVC encoding, not decoding. My CPU has no chance of supporting it either. All this prevents me from converting all my videos or switching to H.265.

I must mention a really interesting thing, though. Some time ago I was searching for the ideal bitrate for the H.265 codec, and thought I should see what the scene thinks. I found discussions from a year ago, and a lot of them had one thing in common: they somehow did not like H.265. The reason they gave is that H.264 has gone through so many revisions that, compared to H.265, it is far more stable and refined. Compare it to how the i3/i5/i7 series kept the same architecture over the years and improved upon it, versus AMD's new Ryzen, which uses a new architecture and still needs work. This was probably about the encoding side rather than the decoding side, as they were angry about the amount of resources the new codec requires, both in encode time and CPU, and in decoding. They also suggested that it is wrong to think H.265 always cuts the video bitrate in half: that is the maximum one should expect, depending on the video, and the ideal cut might be 25-33%, not the 50% people assume. Still, I am not an expert in this, and given the opportunity I will still prefer H.265 over H.264 where possible.
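The 25-33% vs 50% point above is just arithmetic on the source bitrate. A minimal Python sketch, where the savings fractions are the thread's rule-of-thumb figures rather than anything from a codec specification:

```python
def hevc_target_bitrate(h264_kbps, savings=0.30):
    """Estimate an H.265 target bitrate from an H.264 one.

    `savings` is the fraction of bitrate shaved off: the discussion
    suggests 0.25-0.33 as a realistic range, with 0.50 as an
    optimistic upper bound that depends on the content.
    """
    if not 0.0 <= savings < 1.0:
        raise ValueError("savings must be in [0, 1)")
    return h264_kbps * (1.0 - savings)

# A 4000 kbps H.264 source with a conservative 30% cut:
print(hevc_target_bitrate(4000))        # 2800.0
# The optimistic "half the bitrate" claim:
print(hevc_target_bitrate(4000, 0.50))  # 2000.0
```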



2 hours ago, DKT27 said:

As someone who is not an avid movie watcher, this 8-bit vs 10-bit video thing does confuse me. I doubt I have ever seen a 10-bit video myself, so I wonder if they are common for 1080p downloads.

Either way, one thing I must mention: the tool posted in this thread thinks that my graphics card supports 8-bit HEVC, but the truth is my graphics card only supports HEVC encoding, not decoding. My CPU has no chance of supporting it either. All this prevents me from converting all my videos or switching to H.265.

I must mention a really interesting thing, though. Some time ago I was searching for the ideal bitrate for the H.265 codec, and thought I should see what the scene thinks. I found discussions from a year ago, and a lot of them had one thing in common: they somehow did not like H.265. The reason they gave is that H.264 has gone through so many revisions that, compared to H.265, it is far more stable and refined. Compare it to how the i3/i5/i7 series kept the same architecture over the years and improved upon it, versus AMD's new Ryzen, which uses a new architecture and still needs work. This was probably about the encoding side rather than the decoding side, as they were angry about the amount of resources the new codec requires, both in encode time and CPU, and in decoding. They also suggested that it is wrong to think H.265 always cuts the video bitrate in half: that is the maximum one should expect, depending on the video, and the ideal cut might be 25-33%, not the 50% people assume. Still, I am not an expert in this, and given the opportunity I will still prefer H.265 over H.264 where possible.

So the lurker finally popped up :spank:

H.265 has been going mainstream since 2015 and is now in a transition phase, so it is no surprise there are some compatibility issues. The 8-bit/10-bit thing refers to the bit depth of the encoded video: the higher it is, the more color gradations it can represent, which means smoother gradients with less banding. 8-bit is fine for most displays; 10-bit encodes will look better on displays like Retina, AMOLED/SAMOLED, QLED, etc. For the rest, 8-bit is good enough. You can see now that Kaby Lake supports full encode/decode of 10-bit HEVC, so there will be no problem, as CPU resources will be left free.
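The bit-depth difference is easy to quantify: each extra bit doubles the number of shades a color channel can store. A quick Python illustration:

```python
def levels_per_channel(bit_depth):
    """Number of distinct shades one color channel can store."""
    return 2 ** bit_depth

for depth in (8, 10):
    per_channel = levels_per_channel(depth)
    total = per_channel ** 3  # three channels (e.g. R, G, B)
    print(f"{depth}-bit: {per_channel} shades/channel, {total:,} colors")
```

So a 10-bit encode has four times the shades per channel of an 8-bit one, which is where the smoother gradients come from.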

 

HEVC cuts the video size down by almost half: a 400 MB H.265 video looks the same as a 700 MB H.264 video. So it is a very good thing.
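The 400 MB vs 700 MB comparison is just bitrate times duration. A quick sanity check in Python; the 90-minute runtime and the bitrates are assumptions picked to match those file sizes, not figures from the thread:

```python
def size_mb(bitrate_kbps, minutes):
    """Approximate file size in MB for a constant-bitrate stream.

    Uses 1 MB = 8000 kbit (decimal megabytes, 8 bits per byte).
    """
    return bitrate_kbps * minutes * 60 / 8000

# A 90-minute movie at ~1037 kbps is roughly the 700 MB H.264 release:
print(round(size_mb(1037, 90)))  # 700
# Cutting the bitrate to ~593 kbps gives the ~400 MB H.265 version:
print(round(size_mb(593, 90)))   # 400
```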



10 hours ago, teodz1984 said:

But if you take the encoding side... most will choose not to use H.265 because it is slow compared to H.264... A movie that takes an hour to encode will take 4 to 10 times as long, depending on the hardware setup...

https://daredreamermag.com/2012/08/22/understanding-8bit-vs-10bit/

Yeah, for encoders it is a hard job to encode in H.265, the compression being relatively heavier than H.264's. But for viewers it is a very good thing if you have the required hardware. As CPUs and GPUs keep getting more and more powerful, H.265 will soon become the "go to" option for encoders.



Yup, but not everyone can join the upgrade wagon, on both sides of the playing field.

It is pretty useless collecting H.265s if you can't play them on consumer hardware players/STBs/TVs.

The point is choosing a video format that is playable with the hardware you currently have. Fewer headaches; wait till the dust settles on which format prevails.

Quote

soon H.265 will become the "go to" option for the encoders.

Not in the near future for most encoders...


  • Administrator

My problem with H.265 is that sooner or later VC1 is going to come along and overtake it, and that will require new hardware again. Frustrating to see this.



8 hours ago, DKT27 said:

My problem with H.265 is that sooner or later VC1 is going to come along and overtake it, and that will require new hardware again. Frustrating to see this.

And why do you think that VC1 will take over H.265?



Sometimes strange things happen...
I have a very old PC (at least 10 years) with a 1.69 GHz AMD CPU and 2.50 GB of RAM, 32-bit.
I had always used VLC media player, and when I tried to play a movie encoded in H.265/HEVC it did not work.
So, without being an expert in configurations, I thought I would try other players, and to my surprise I managed to watch the same movies with Windows Media Player and SMPlayer without modifying any parameters.



I agree. H.265 is not the end; a newer format will surely replace it in the coming years, and people will say "I need to re-encode all my stuff to this", ad infinitum.

Also, with Widevine DRM :) we will see...



  • Administrator
On 15/5/2017 at 10:06 AM, rudrax said:

And why do you think that VC1 will take over H.265?

 

Sorry, my mistake, I meant AV1. It's said to be a successor to H.265 and is being developed jointly by many companies, I think.



5 hours ago, DKT27 said:

 

Sorry, my mistake, I meant AV1. It's said to be a successor to H.265 and is being developed jointly by many companies, I think.

:spank:

Looks good to me, as it promises 25% more efficiency than HEVC and is completely royalty-free, at the cost of somewhat more complex encoding.
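That "25% more efficient than HEVC" figure compounds with HEVC's own savings over H.264. A rough illustration in Python; the percentages are the headline claims quoted in this thread, not measurements:

```python
def av1_bitrate(h264_kbps, hevc_savings=0.30, av1_savings=0.25):
    """Chain the quoted savings: H.264 -> HEVC -> AV1.

    Both fractions are the rule-of-thumb figures from this thread
    (25-33% for HEVC over H.264, ~25% for AV1 over HEVC).
    """
    hevc_kbps = h264_kbps * (1 - hevc_savings)
    return hevc_kbps * (1 - av1_savings)

# A 4000 kbps H.264 source would land around 2100 kbps in AV1:
print(av1_bitrate(4000))  # 2100.0
```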



  • Administrator
On 18/5/2017 at 9:15 AM, rudrax said:

:spank:

Looks good to me, as it promises 25% more efficiency than HEVC and is completely royalty-free, at the cost of somewhat more complex encoding.

 

Free to them, not to us; go buy new hardware for it.



Update: I have finished talking with the Intel support team, and they concluded that it is not possible to add 10-bit HEVC decoding support through an Intel HD Graphics driver update. The iGPU in my N3700 can only do up to 8-bit HEVC.



Why am I not surprised?

I doubt Intel will release 10-bit HEVC support for Braswell; as you can see, it and its successor aren't supported (note that the Pentium is on the lower end of the spectrum). So you have to move on to the next tick-tock level. Part of their planned obsolescence.



Neither am I. Deep down, I knew it could be a hardware-level feature that cannot be added by a software update. Still, I needed to confirm.



You can only do so much with entry-level CPUs*. There is always software decoding.

If that fails, in time you will need to get yourself a new rig.

--------------------------------------

* IMHO, I don't have high expectations of these.



1 hour ago, teodz1984 said:

There is always software codec decoding

That I already have; in fact, every media player does. But it stresses the CPU like hell, and that is my problem.



Then stop watching 10-bit H.265 videos on your laptop, or get a better-specced PC if you need to.

Quote

HEVC cuts the video size down by almost half: a 400 MB H.265 video looks the same as a 700 MB H.264 video. So it is a very good thing.

All this fuss of making Hi10 H.265 videos is now common in the anime fansubbing community. They wanted lean-sized releases at the cost of taxing the encoder's machine with long encode times. Most of them think that if they adopt the latest video and audio formats they will be the coolest guys on the block, yet they only cater to the limited number of viewers who can comfortably play these media on their rigs, mostly higher-end PCs.

A lot of release groups emphasized just the formats and not the quality of the encodes. 10-bit encoding is only good if you encode from the original lossless source (preferably Blu-rays) where all the original information is intact. Encoding is an art; some people are good at it, others make a botched job of it.

Transcoding an 8-bit lossy source to 10-bit (which in my book is a waste of time for both the encoder and the viewer) takes too long and doesn't improve on the original, except perhaps shaving the file size.

Lastly, what use is a half-sized movie if your hardware can't play it comfortably? I say stop keeping up with the hype. Use what works on your hardware.



Archived

This topic is now archived and is closed to further replies.
