
Latest-ish MPC-HC ported for XP



Posted (edited)
On 09 September 2024 at 1:51 AM, Dixel said:

Nvidia drivers are a closed source software, kept in secret. For XP one would need to write new portions of code and insert into them. Not possible at this stage, unless you have a mate working at nVidia to leak the drivers.

Why are you so certain that the nVidia 368.81 drivers do not support hardware acceleration of h265 video under XP?  Have you tried it with a GTX 950 or GTX 960 graphics card, or do you know someone who has?  I am asking because experimenting with these cards would involve me in a time-consuming, difficult, and expensive rebuild of my HTPC with no certainty of the result.  To a naive person like me, all the elements for success seem to be there.  For instance, there are no missing dependencies in nvcuvid.dll, the 368.81 drivers were built on 11/07/2016, and the GTX 950 was released on 13/03/2015.  nVidia have stated that the 368.81 drivers support these two cards.

The results obtained by D.Draker and Ed_Sin are discouraging.  Using the GTX 750 Ti card, D.Draker observed dropped frames galore, so it did not appear to be offloading any of the video processing to the GPU.  Ed_Sin did actually achieve full hardware decoding with the GTX 950, but the output was corrupted with artifacts.  He suggests that changing the renderer to something other than EVR may be necessary.  Possibly, successful hardware decoding of h265 using NVDEC requires EVR, which may or may not work under XP unless you have a backported build of a recent MPC-HC.

Edited by Zorba the Geek

Posted
8 hours ago, ED_Sln said:

I checked hardware acceleration on a GTX 950 in WinXP using LAV. CUVID is indeed there and it works, but there are problems with h265 and VP9: the video opens and hardware acceleration works, but the picture is scattered into many artifacts and is impossible to watch, so in practice only h264 works; no problems with it even in 4K. But CUVID only supports Nvidia GT 630 and newer graphics cards, so its use is quite limited. Still, those who have suitable video cards will be able to get hardware acceleration back in the ported versions of MPC-HC and MPC-BE, where there is no VMR.


That's an interesting one. Usually this happens due to bad or unoptimized code.

Going back to basics, without MMX and the other extensions:

1920 × 1080 (in RGB24)

RGB = 8 + 8 + 8 (bits)

1920 × 1080 × 24 = 49,766,400 bits per frame / 8 = bytes / 1024 / 1024 = megabytes

that makes 5.9326 megabytes per frame × 24 (24 frames for smooth video) ≈ 142 MB/s

with a 32-bit tick that makes 37,324,800 Hz ≈ 37 MHz
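The arithmetic above can be checked with a short script (the 24 fps figure and the 32-bit word size are the same assumptions used in the post):

```python
# Raw bandwidth needed for uncompressed 1920x1080 RGB24 video at 24 fps.
BITS_PER_PIXEL = 8 + 8 + 8          # R + G + B
WIDTH, HEIGHT, FPS = 1920, 1080, 24
WORD_BITS = 32                      # one 32-bit "tick" moves 32 bits

bits_per_frame = WIDTH * HEIGHT * BITS_PER_PIXEL
mb_per_frame = bits_per_frame / 8 / 1024 / 1024
mb_per_second = mb_per_frame * FPS
ticks_per_second = bits_per_frame * FPS / WORD_BITS

print(bits_per_frame)               # 49766400 bits per frame
print(round(mb_per_frame, 4))       # 5.9326 MB per frame
print(round(mb_per_second, 1))      # 142.4 MB/s
print(ticks_per_second / 1e6)       # 37.3248 MHz
```

This confirms the ~142 MB/s and ~37 MHz figures.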

 

That is actually not too much, even without the cache, MMX through AVX, and other CPU technologies (like quad pumping).

A common CPU works with predictions: it collects the opcodes in a cache, and then the CPU decides what it physically processes. That makes the CPU a lot faster.

Here is also where the 64-bit vs 32-bit question comes in. At first glance 64-bit might seem faster because it can process more bits per tick, but in practice the 32-bit CPU can keep track of what is going on and is limited by its output bus instead. To take the FSB as an example: the bus does not pump just 32 bits per clock; it can pump 128 bits (quad pumped) or even more.
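To illustrate the quad-pumping point, here is a small sketch; the 200 MHz base clock and 64-bit data bus are hypothetical example values, not figures from the post:

```python
# Effective bandwidth of a quad-pumped front-side bus (illustrative numbers only).
BASE_CLOCK_HZ = 200_000_000    # 200 MHz base clock (example value)
TRANSFERS_PER_CLOCK = 4        # "quad pumped": four transfers per clock cycle
BUS_WIDTH_BITS = 64            # FSB data-bus width (example value)

effective_transfers = BASE_CLOCK_HZ * TRANSFERS_PER_CLOCK       # 800 MT/s
bandwidth_bytes = effective_transfers * BUS_WIDTH_BITS // 8     # bytes per second
print(bandwidth_bytes)  # 6400000000 bytes/s, i.e. 6.4 GB/s
```

The point being that the bus moves far more than 32 bits per clock, so the "32-bit" label on the CPU is not the bandwidth bottleneck it might appear to be.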

 

Sure, the decoder itself is not part of this calculation, but you can tell there is a lot of free room: often we only get smooth 720p, while 1080p should already be perfectly acceptable.

 

 

So I think the reasons have to be found elsewhere.

If it really hands the data to a graphics card, it runs through many modules and much code before it even arrives there.

 

The 8 bit (aka 8 + 8 + 8) question is an old discussion too; there used to be the .GIF format question (GIF only had, or has, 256 colors).

But 24 bit means 16 million possible colors, instead of 256.

I remember the discussion about that, and 16 million is by far enough.

You don't have that many pixels to differentiate.

Let's say the lines are 1080p: compared to 16,777,216 possible colors for each pixel, 1080 pixels is nothing; even 8K would not show the difference (in either the x or y direction). 16,777,216 / 1080 ≈ 15,534, so there are about 15,534 times more colors than pixels that can fit into a 1080p line.
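The colors-versus-pixels comparison works out like this:

```python
# 24-bit RGB: 8 bits per channel -> number of distinct colors.
colors_24bit = 2 ** (8 + 8 + 8)
line_pixels = 1080                   # pixels along one 1080p dimension

print(colors_24bit)                  # 16777216 possible colors per pixel
print(colors_24bit // line_pixels)   # 15534 -> ~15,534x more colors than pixels in a line
```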

Seeing pictures now shipped even with 48-bit RGB, I think that knowledge might have been forgotten.

 

Some people tend to say that 10 bit sometimes looks a bit better, but theoretically that's incorrect.

What is correct is that the encoder/process might not have given some pixels a different color; that can happen.

But physically, RGB-buffer-wise (that's what everybody uses; even displays themselves use RGB subpixels, not YUV), 24-bit RGB is by far enough. Having said that, it sounds weird to me to use 10 bits per color.
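For comparison, the difference between 8 and 10 bits per channel is just the number of shades per channel and the resulting color count:

```python
# Shades per channel and total color count for 8-bit vs 10-bit RGB.
for bits in (8, 10):
    shades = 2 ** bits      # levels per channel
    total = shades ** 3     # R * G * B combinations
    print(bits, shades, total)
# 8  -> 256 shades per channel,  16777216 total colors
# 10 -> 1024 shades per channel, 1073741824 total colors
```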

 

Having said that, I think it must be inefficient code. How about you make the decoder? It is actually open source at x265.com.

I do not think the hardware acceleration question has to lead to something like: "because the hardware-accelerated picture stutters, it can't be done with a normal CPU or XMM."

 

Rather, I have heard that actually nobody even has a precise idea how that "engine" really works (engines can be kind of slow, specially scripted, and some jump around in the OS).

It sounds like the debunked claim that you need a hardware de/encoder for h265 to even have h265.

 

At the moment it seems that we don't know why it stutters. If that can't be fixed, there are still open-source decoders.

Posted
2 hours ago, Zorba the Geek said:

Does anyone know what is the highest spec CPU that can be installed under XP?

Ryzen 7 5800X.

On 9/10/2024 at 6:46 AM, Dixel said:

x265 10 bit tends to suppress film grain, H265 8bit tends to over-saturate videos a lot. AV1 tends to over-contrast videos a lot.

Thank you! Finally somebody said that. I was recently walking in the park, thinking to myself that all these "new" formats create a picture inferior to well-tuned h.264.

P.S. I can add from my own experience that h.265 also creates some kind of murkiness in videos, and AV1 is insanely slow at encoding.

Posted
14 hours ago, Rod Steel said:

Thank you! Finally somebody said that. I was recently walking in the park, thinking to myself that all these "new" formats create a picture inferior to well-tuned h.264.

P.S. I can add from my own experience that h.265 also creates some kind of murkiness in videos, and AV1 is insanely slow at encoding.

You're welcome! Yes, indeed murky. AV1 is murky too, but it tries to compensate with high-contrast settings. Murky/blurry video needs fewer pixels, and fewer pixels mean a smaller file size; simple math. Then one can claim he achieved a new breakthrough!

H264/x264 were the last good formats. There are some rare exceptions for H265 8-bit made with the first encoders; they are somewhat bearable, but not as good as x264.

NVENC was always a piece of blurry garbage for all formats anyway; the produced result was never sharp.

Posted
17 hours ago, Zorba the Geek said:

Why are you so certain that the nVidia 368.81 drivers do not support hardware acceleration of h265 video under XP?  Have you tried it with a GTX 950 or GTX 960 graphics card, or do you know someone who has?  I am asking because experimenting with these cards would involve me in a time-consuming, difficult, and expensive rebuild of my HTPC with no certainty of the result.  To a naive person like me, all the elements for success seem to be there.  For instance, there are no missing dependencies in nvcuvid.dll, the 368.81 drivers were built on 11/07/2016, and the GTX 950 was released on 13/03/2015.  nVidia have stated that the 368.81 drivers support these two cards.

I said "quite sure", not 100% certain. It's easy to guess, because nVidia is famous for making artificial restrictions for older OSes in general; a good example was the artificial refresh rate limit introduced for XP.

Sorry, I don't have a GTX 960/950. I have a GTX Titan (Jan. 2013). And some time ago I found a working 780 Ti in the local dumpster (e-waste).

Both only support 8 bit H265, not 10 bit.

Posted
17 hours ago, Zorba the Geek said:

Why are you so certain

What I am 100% certain of: the MPC-HC 64-bit version decodes 10-bit H265 faster than the 32-bit counterpart. Try 64-bit XP, maybe?

Posted
18 hours ago, Zorba the Geek said:

Maybe you should try the Lentoid decoder again on your current Pcs, and share the results with us.

I'd gladly help you, but I don't have a high-end CPU or PC. My most "powerful" is a Pentium G from 2013? 2014?

Supposedly I can push it hard (overclock to the max), but then again, it's not far from that old Quad Core.

Posted (edited)
13 hours ago, Saxon said:

What about the ancient VC-1? The GTX 950 doesn't support it, or is it XP's fault?

VC-1 is supported; it's just disabled by default in LAV. I don't have any videos in that format, so I didn't turn it on.

11 hours ago, Zorba the Geek said:

He suggests that changing the renderer to something other than EVR may be necessary.  Possibly, successful hardware decoding of h265 using NVDEC requires EVR, which may or may not work under XP unless you have a backported build of a recent MPC-HC.

I checked in older versions of MPC, where there is VMR: the video also crashes with artifacts. Looks like the problem is in LAV itself; apparently this version doesn't work properly in XP.

I downloaded the unofficial LAV 0.79.2, and it's much better: artifacts appear after starting the video but disappear after a few seconds. However, hardware acceleration works only for 8-bit h265; 10-bit is not accelerated. To make it work, you need to disable the h264 and h265 codecs built into the player, as well as disable the mkv, webm and mp4 filters, and add the external filters from LAV.

Screenshots:

 

I checked it working in MPC-HC 2.1.7.2.


And in MPC-BE 1.7.3. The video in both cases is h265 4K, the processor an Athlon X2 2.6 GHz. With hardware acceleration, even 4K video does not load the processor much.


Edited by ED_Sln
Posted
12 hours ago, ED_Sln said:

VC-1 is supported; it's just disabled by default in LAV. I don't have any videos in that format, so I didn't turn it on.

VC-1 is disabled by default in LAV?!?!?! Are they crazy? It's the default format for BluRay disks.

How is this possible? Probably you wanted to say it's disabled in MPC-BE, the enemy of old OSes?

I just can't comprehend who would want to compromise the common BluRay disk standard playback...

Posted (edited)
3 hours ago, Dixel said:

VC-1 is disabled by default in LAV?!?!?!?!?! Are they crazy? It's the default format for BluRay disks.

In LAV 0.70, hardware acceleration is disabled for most formats; it's not clear why. Maybe K-Lite set it that way (I installed LAV through K-Lite). And you are confusing VC-1 with AVC (h264); that is the codec used on BluRay disks.

Edited by ED_Sln
Posted

Something is still wrong with h265: after rebooting the system, the video crashes again with artifacts, even though I haven't installed or changed anything, neither in the players nor in the system.

Posted
9 minutes ago, Dixel said:

No, I don't,

Most likely it was used at the dawn of the BD era. I've seen a lot of BD disks, and they always use h264, or h265 if the video is 4K. Although, according to the wiki, some studios encoded more in VC-1, I apparently haven't come across such disks.

Posted

About LAV: it seems those settings came from K-Lite. I reinstalled LAV and cleaned its entries from the registry; now VC-1 is enabled, but MPEG-4 is disabled.

Posted

Most original Blu-Rays sold in Europe with popular films are VC-1 in a .ts container; Nightmare on Elm Street (1984) would be a good example.
