
nVidia Shield and Windows 10

SD_J-I_88

Hey, is it possible for the nVidia Shield to access my Windows 10 PC, which has all my MKVs and 3D BD ISOs, and play them back on my TV or beamer?
 
Yes to mkv. No to iso. And definitely not anything 3D. There are multiple ways of going about it. I use Plex, which works very well. Otherwise you can mount a shared folder as an SMB share in the storage settings on the Shield. After it's mounted, applications like Kodi can load files from it.

Sent from my SM-G975U using Tapatalk
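For the SMB route mentioned above, Kodi keeps its media sources in a `sources.xml` file in its userdata folder. A minimal sketch of a video source pointing at a Windows share could look like this (the IP address and share name here are made-up placeholders, not from this thread):

```xml
<sources>
    <video>
        <source>
            <name>Movies</name>
            <path pathversion="1">smb://192.168.1.10/Movies/</path>
            <allowsharing>true</allowsharing>
        </source>
    </video>
</sources>
```

You can also add the same source from Kodi's GUI (Videos → Files → Add videos); the file above is roughly what that produces.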
 
No to iso.
Well, it does iso, if the iso isn't ... "demanding". "BD-Lite" (no menus) usually works. Or AnyDVD speedmenus. Let's say "sometimes" to iso. :)
 
Well, it does iso, if the iso isn't ... "demanding". "BD-Lite" (no menus) usually works. Or AnyDVD speedmenus. Let's say "sometimes" to iso. :)
OK that's fair. You'd have to use vlc or maybe kodi for that though.

 
All right, I always stored the 3D ISOs with nothing but the main movie and captions, by using the MPLS with tsMuxeR.

So ISOs are okay, but no 3D stuff?

I've seen some videos on YouTube about the SMB server, but they said you have to use SMB version 1.0, which was deprecated by Microsoft a few years ago because of security issues?
 
Yeah, they're not wrong. Another way would be to set up a DLNA server on your Windows box and serve them up that way. Kodi can connect to it through DLNA.

3D stuff is not supported on the SHIELD at all. It never has been, and given that nVidia has dropped support for it in the GeForce GPU drivers, it never will be.
 
Yes, you are right about the 3D support; that's why I had to stop Windows from automatically downloading newer GeForce drivers. :(

OK, then I'll try the Shield when Dolby Vision support for .mkv is more stable :).
 
I've seen some videos on YouTube about the SMB server, but they said you have to use SMB version 1.0, which was deprecated by Microsoft a few years ago because of security issues?
The SMB1 restriction only applies if you use the Shield as an SMB server. (And there is a beta version with SMB3 support.)
If you use Kodi, you shouldn't have any problem accessing SMB3 shares.
 
Yes, you are right about the 3D support; that's why I had to stop Windows from automatically downloading newer GeForce drivers. :(

OK, then I'll try the Shield when Dolby Vision support for .mkv is more stable :).
Maybe you should take a look at the OSMC Vero 4k+. And Dolby Vision is highly overrated anyway.
 
Thx James, I didn't even know that this brand exists.

I don't have DV compatible Hardware but maybe in the future.
I read some comparisons of HDR10 and DV, and yes, it is not that different.
Ripping the main movie with CloneBD should preserve DV for later conversion to .mkv?

I don't think that DV will find its way into beamer hardware because of the high brightness range it requires, and watching movies on a TV is not that cinema-like, in my opinion.
 
Thx James, I didn't even know that this brand exists.
Cool little device. But not really useful for .iso playback.
If you want .iso playback with 3D, you could go with a fake Oppo.
The Shield is the best all-rounder, with Kodi, Netflix, Amazon Prime, all the Android TV apps. Very useful if you have a projector and no "smart" TV. Maybe you want to watch a soccer game from time to time. Or YouTube...
But no 3D or .iso.

You could use a Shield TV *and* a fake Oppo for 3D and isos. :)

I don't have DV compatible Hardware but maybe in the future.
Or you could simply ignore all the marketing bla and live without it.

I read some comparisons about HDR10 and DV and yes it is not that different.
It often is "different" but not necessarily "better".

Ripping the main movie with CloneBD should preserve DV for later conversion to .mkv?
Yes.

I don't think that DV will find its way into Beamer hardware because of the high range of brightness and watching movies on a TV is not that cinema like in my opinion.
IMHO any form of HDR on anything but a three-laser "Dolby Vision" or IMAX digital projector, or an LED wall, is a waste. With a real-life on/off contrast of 10000:1 (ANSI contrast around 100:1?), the dynamic range is not really "high" but rather limited. Even if projectors were able to output the high brightness (1000 nits from the screen?), the picture wouldn't go "dark" anymore. That makes the "H" in HDR quite a joke for home projectors. IMHO HDR is not suitable for (home) cinemas.
Don't get me wrong, I love my projector setup with 3D sound and whatnot. But certainly not because of "HDR".
 
It depends on your environment whether DV is useless or not. For my OLED, DV makes a marked difference over HDR, but not for the reason most people think. HDR on my C8 has the tone-map curve set to 700 nits max, even if the display is capable of more; e.g. my C8 tops out around 800 or so nits. But HDR can't make use of the extra 100 nits. DV can. So even if you ignore the dynamic metadata of DV, which is arguably better or worse than static HDR metadata depending on how it was graded, that extra range is useful. For something like my SHIELD, having DV available is a noticeable improvement for UHD rips, and more so on streaming services.

Now, that all being said, if you're doing beamers, why are you even contemplating any of this? LOL The best solution already exists for that world, and it ain't HDR or DV. It's madVR running on a high-end RTX card! That can dynamically tone map your HDR sources to a range your beamer can work with. Some of them are dynamically tone mapping down to 50 nits. Personally I find that too limiting, but the results they're getting are nothing short of remarkable. Get a beamer with 110+ nits and you can do some freaking magic with madVR. Even on my OLED, HDR dynamically tone mapped to 700 nits compares very favorably to the best DV picture I can get out of the SHIELD or my UB820. If I had a projector, it wouldn't even be a discussion.
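The dynamic tone mapping idea can be illustrated with a toy example. This is not madVR's actual algorithm, just a simple "extended Reinhard" curve; the 4000-nit content peak and 700-nit display peak are assumed example values matching the numbers discussed in this thread:

```python
def tone_map_nits(nits_in: float, content_peak: float = 4000.0,
                  display_peak: float = 700.0) -> float:
    """Map a scene luminance (in nits) into the display's range.

    Toy extended-Reinhard curve: content_peak lands exactly on
    display_peak, highlights are compressed, shadows stay nearly linear.
    """
    # Normalize so that 1.0 equals the display's peak brightness.
    l = nits_in / display_peak
    l_max = content_peak / display_peak
    mapped = l * (1.0 + l / (l_max * l_max)) / (1.0 + l)
    return mapped * display_peak

# A 4000-nit highlight is squeezed exactly onto the 700-nit display peak,
# while a 100-nit mid-tone is only slightly darkened.
print(tone_map_nits(4000.0))  # 700.0 (within rounding)
print(tone_map_nits(100.0))   # roughly 88 nits
```

Real tone mappers (madVR, BT.2390-style EETFs) work per scene or per frame and in RGB rather than on a single luminance value, but the shape of the curve is the same idea: preserve the shadows, roll off the highlights.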
 
I'm hoping to upgrade my rig at the end of the year to the top-of-the-line RTX 3000. When I ordered my current 1080A, I had initially ordered a 1080 OC (just a tad higher, and just below a Ti), but it never came in stock at that retailer. I've never had the most high-end consumer card available (Titans aren't really meant for consumers) in all my years of system building. Really want to get the highest card available this time. So hopefully the 3080 Ti, or even the rumored 3090. But we'll see :)
 
If I had a projector, it wouldn't even be a discussion.
True. But I'm a little too lazy. And the projector does a "good enough" job to ... "undynamic" the "high dynamic" range so it looks fine.
And after Nvidia kicked out 3D, I still need a fake Oppo or something similar. Yes, there are workarounds like the "BringBack3D" tool, but how long will this work...?
 
I'm getting a 3080 Ti when it's out in September. My 2070 is good, but I still have to compromise.

As for being lazy, James: sure, but once you set it up and tweak the crap out of it to find the optimal settings for your HT environment, you don't need to touch it again. They've changed the HDR API in the latest drivers, so my HTPC is not running the latest and greatest, which means the 3D tool would work just fine if I had a 3D-capable display connected to it. This isn't the gaming world, where you need to stay on the latest graphics driver every time they release a new one. Find what works and leave it alone, except for critical security updates.
 
I skipped the 2000 series; not enough performance difference compared to my 1080A. Now the 3000 series should blow any 2000-series card out of the water by an easy 20+%, even more from what I read. I'd love to move away from a dual-monitor setup and go with one WQHD (at 4K maybe), or whatever it's called, as the primary monitor, and then the two 27" (1080p) ones on top, and go triple monitor. Don't think my 1080A would like that.
 
I want the 3080 Ti for two reasons. One is that I do game, and I want it for Cyberpunk 2077. And I want the horsepower for madVR tone mapping and upscaling. People who've never seen what madVR can do for upscaling have no real idea. "But my 300-dollar UHD player upscales!" Yeah... no, not even remotely close. There's a reason the Envy is going to cost 12 grand. I want some breathing room on the high end in case madshi decides to release some new stuff for the HTPC after all.
 
Cyberpunk, you and me both! Maybe once my old plasma dies I'll look into Plex and HDR, or set up a small HTPC to feed directly into my TV.
 
Cyberpunk, you and me both! Maybe once my old plasma dies I'll look into Plex and HDR, or set up a small HTPC to feed directly into my TV.

That's what I have today. I made a 5-HDD storage pool with Win 10 where my movies are, and play them with VLC for everything except 3D. 3D ISOs are played with PowerDVD. But sometimes 3D does not work because PowerDVD has some problems, or audio will not play because I opened a YouTube video with Firefox and had to switch the audio source back to HDMI, and so on. Or a new Windows update messes up my configuration.

That's why I looked into a device like the Shield or the OSMC Vero that takes the .mkv or .iso data and just delivers always-functional, smooth playback and does not struggle with changing fps or whatever.

As for displaying, I have to admit that HDR on an (OLED) TV looks stunning, but for movies I like the beamer/projector at 24 fps best because of the cinematic look; that's not a rational thing...
I wanted to preserve the DV layers so that, if there is compatible hardware in the future, I don't have to convert the stuff again.
For 3D I like watching movies like Prometheus or colorful stuff like the LEGO movies, but I have to admit that 3D is dying and no one knows how long it will be supported by beamers/projectors.

Another question:
I have a Marantz SR6012 that upmixes audio to DTS:X and Atmos, and I have to admit that I can't really tell a difference between originally mixed DTS:X/Atmos and sound "upscaled" to DTS:X/Atmos. Do you agree?
 