VR News


Yup, I agree with all of your hands.

But. :slight_smile:

Because it is really hardware upscaling on the device rather than actually taking in true 8K (which would need two DisplayPort connections, i.e. a weird dual-GPU thing that wouldn’t have common game engine support), it’s possible that it is actually just similar to how people run DCS today using a Pixel Density of 2.0.
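As a rough sanity check of that comparison - note the per-eye figures below are my assumptions about how such an upscaling headset would work (1440p input per eye, upscaled on-device), not spec-sheet numbers:

```python
# Rough pixel-count comparison: a hypothetical "8K" headset that accepts a
# lower-resolution input per eye and upscales on-device, vs. running a
# current-gen HMD with a render-scale ("Pixel Density") multiplier.

upscaler_input_per_eye = 2560 * 1440  # assumed input fed to the headset, per eye
rift_per_eye = 1080 * 1200            # current-gen per-eye panel
pd = 2.0                              # per-axis multiplier, as in DCS's PD setting

# A per-axis multiplier of 2.0 quadruples the rendered pixel count.
rift_rendered_per_eye = rift_per_eye * pd * pd

ratio = upscaler_input_per_eye / rift_rendered_per_eye
print(f"Upscaler input is {ratio:.2f}x the pixels of a Rift at PD 2.0")
```

Under those assumptions, the GPU load would be in the same ballpark as (actually a bit below) a Rift running at PD 2.0, which is the point being made above.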

That plain old 1080 would be good enough, and it’s not like you’d have to get that just for Weird Bet VR alone.

(Hmm, I think I have become a carrier of the virus known as Rixian Paulus - send antibiotics stat!)


I’ve got a 980Ti, so I missed the big gains of the 10 series. I’ve been holding off on upgrading until, well, I see a VR set that fixes my issues with VR :smiley:


With luck the cryptocurrencies (mainly Ethereum plus derived ones) will crash due to the Chinese exchanges clamping down, and the price of GPU cards will readjust a bit. For Pascal, the 1080 Ti is going to be about it, I think. For desktop Volta, dunno, maybe spring 2018? Not seen a lot of rumors. The AMD Vegas I don’t know so much about (@SkateZilla?), but I get the impression they are priced around the Pascal levels (plus coin miners love ’em, which pushes up the price).


Do you really want me to open that Pandora’s box Chris? :wink:


Radeon Vega 64 is priced around the 1080 Founders Edition and they trade blows, but the Vega series is still way behind in terms of power consumption.

Radeon Vega 56 is priced around the 1070 Founders Edition and trades blows w/ the 1070 FE, but again has power consumption issues compared to the 1070.

Now, give AMD 2 quarters to work out fab quality and the voltages and power consumption may drop. It does seem like the Vega 64 was pushed to its limits out of the box to get it on the level of the 1080 FE.

But you also have to remember the Vega 56/64 Chips are almost a year late to the market due to HBM2 Memory Chip Shortages and Delays.

So Vega 10 by itself was basically Polaris w/ some Tweaks and HBM2.
Vega XX will likely be the next step early next year w/ Die Shrinks.

Navi will integrate the same core design element they currently use on the Ryzen Processors.
Basically, instead of trying to fab and produce very large dies at high cost, w/ reduced yields due to defects
(cough nVidia Titan/1080 Ti’s cough cough).

AMD will produce smaller, more affordable dies and link them together w/ the next-gen memory system on an Infinity Fabric-style interposer.

Your System will see it as one GPU, when in fact it will be 4-8 separate smaller GPUs Linked together.

This alone would reduce costs significantly as the larger dies are simply getting too expensive to fab.

Power Wise, w/ new memory and smaller fab. nodes, it may catch up to nVidia or it may not.

As for mining, the RX Vega 56/64 cards come in just behind the 1080s and only get stomped by the Titan Xp.
But the Power Consumption is not the same.

RX480/580s still win Power:Hash Rate Ratio for AMD.

And those lose to the 7970 late editions (GHz, BE, etc. etc.).

My MSi Lightning R7970 still sells on the 2nd hand market for $1000 Plus for a GCN 1.0 Card that’s 7 years old now.


Has anyone benchmarked this to see what it does with DCS?


If I understood correctly, this is future talk - the AMD GPU generation after the next.


This is what will come with Navi, next year 2018.

As far as DCS and DX11/12 etc. are concerned, nothing changes.

GPUs now are already distinct shader groups, divided into sections with sub-processors, caches, etc.

The difference w/ Navi is that instead of cramming everything onto one huge GPU, they use 4 smaller ones linked on the interposer w/ HBM and a new memory interface.

So, using a general reference of $1 per mm^2 of silicon (i.e. $1,000 per 1,000mm^2 of wafer):

A huge 4096-core, 500mm^2 GPU would have a cost of about $500 a pop.
A smaller 1024-core, 250mm^2 GPU would have a cost of about $250 a pop.

The company can take the smaller chips, link them on the interposer to build up the same 4096-core processor, and still be able to scale that one chip to fit the needs of their entire graphics card family/range:
Scaling from 1 @ 1024, to 2 @ 2048, to 4 @ 4096, to 6 @ 6144, to 8 @ 8192 cores.

They would only have to pay to fabricate ONE Chip design, and not have to lease multiple production lines to run multiple wafers.

This doesn’t even take into account the money saved on imperfections in the wafers causing defects and low yields.

So, say each wafer has 2 defects. Depending on their locations:

On the 500mm^2 design, a 2,000mm^2 wafer yields 4 chips, and those 2 defects can render 1 or 2 of them defective - that’s 25-50% of the yield, or $500-$1,000 lost to defects.
On the 250mm^2 design, the same wafer yields 8 chips, and the same 2 defects only kill 1 or 2 of 8 - that’s 12-25% of the yield, or $250-$500 lost to defects.
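The defect arithmetic can be put into a quick toy model (all figures are illustrative assumptions - $1 per mm^2 of silicon, perfect die tiling with no edge loss, and each defect killing exactly one die):

```python
# Toy model of the die-size vs. yield argument above.
# Assumed numbers, not real foundry figures.

def die_cost_and_defect_loss(wafer_area_mm2, die_area_mm2, defects,
                             cost_per_mm2=1.0):
    """Return (cost per die, dies per wafer, worst-case $ lost to defects)."""
    dies_per_wafer = wafer_area_mm2 // die_area_mm2  # assume perfect tiling
    cost_per_die = die_area_mm2 * cost_per_mm2
    # Worst case: every defect lands on a different die and kills it.
    dead_dies = min(defects, dies_per_wafer)
    loss = dead_dies * cost_per_die
    return cost_per_die, dies_per_wafer, loss

wafer = 2000  # mm^2 of usable silicon per wafer
for die in (500, 250):
    cost, count, loss = die_cost_and_defect_loss(wafer, die, defects=2)
    print(f"{die}mm^2 die: ${cost:.0f} each, {count} per wafer, "
          f"worst case ${loss:.0f} ({100 * min(2, count) / count:.0f}% of "
          f"yield) lost to 2 defects")
```

The worst case assumes every defect lands on a different die; in reality defects cluster, and damaged small dies can sometimes still be sold as cut-down parts, which helps the small-die approach even more.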

Now extrapolate that to the real-life cost of a 14nm production wafer, which is somewhere around $5-6K each (a 300mm wafer, roughly 70,000mm^2 of area),

Then extrapolate that to how many runs the company does to meet supply demand.

you’re looking at thousands of wafers.

If you could minimize how many chips you throw away to wafer defects and poor fabrication yields, you would not only save production costs, but you’d be able to sell the product cheaper.

Hence why Ryzen is significantly cheaper than the Intel i9s etc. - the 8-core/16-thread chips aren’t fabbed as monolithic 8-core/16-thread blocks; they are two 4-core/8-thread CCX modules linked over Infinity Fabric, before being mounted to a CPU substrate package.


Talos Principle in VR coming Oct 17th.

I really enjoyed The Witness, but didn’t try this similar sounding one. It looks like they completely redid it for VR.



Samsung’s VR entry via Microsoft. No external sensors (tracking is via on-board cameras), 2880x1600 @ 90 Hz OLED with 110° FOV (compared to the Rift/Vive’s 2160x1200, so a 77% increase in pixel count). SteamVR compatibility upcoming (allowing you to use Vive controllers with it). $499.
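For anyone checking that pixel-count figure, it’s just the ratio of total panel pixels:

```python
# Pixel-count comparison between the quoted panel resolutions.
samsung = 2880 * 1600    # Samsung HMD, total panel pixels
rift_vive = 2160 * 1200  # Rift / Vive, total panel pixels

increase = (samsung / rift_vive - 1) * 100
print(f"{increase:.1f}% more pixels")  # prints "77.8% more pixels"
```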

Might be a good Vive upgrade path (as you can keep using the roomscale controllers you have) or a sit-down flight-sim/driving replacement for the Rift, due to the higher resolution and OK price.



Fascinating. So, if I understand this correctly, one could play anything from the Steam store, but not necessarily Vive titles, and definitely nothing from the Oculus store?


Vive titles are all SteamVR, so you could play all of those. It’s more the physical restrictions of using the controllers that come with it: if a title relies on you putting your hands up above your head, then the sort of tracking that comes with the Samsung will struggle (as it’s using cameras mounted on your HMD rather than external wired cameras or lighthouses). If you have a Vive already then you should be able to use your existing Vive controllers with this headset and get the best of both worlds - SteamVR can mix and match like that.

As for the Oculus store, the Revive ‘hack’ of making your HMD appear as a Rift will allow you to at least run your existing Oculus Store titles on a new SteamVR headset. It is a hack, but Oculus have said that they are not that interested in trying to disable it. Also, Oculus/Facebook might become a ‘multi-platform’ vendor one day - it seems inevitable if they are to stay relevant, i.e. join the SteamVR party via the new open OpenXR standard while keeping their own store.

Microsoft will, I expect, also be offering their own store as well, and will do Microsoft exclusive titles I’d imagine (Halo, Forza etc).

The Samsung and the Pimax show that ‘higher resolution as a selling point’ is where we are at with VR now, though. The $500 peripheral price point seems to be the target as well.


Having the cameras on the HMD shouldn’t be a problem for today’s sims, though, as they don’t really utilize the hand controllers. I had to get a second sensor to optimize the head tracking on my Rift, as parts of my pit block the view of a single camera.

Very curious how the visuals (and performance) compare!


Yep, me too. As a lot of people run DCS with over-sampling anyway, it’s not like this is out of the range of running nicely at native resolution on a 1080 etc. The Samsung display panels are meant to be really nice, and this could be a real step up, especially for combat flight sims.

One unknown is how well-implemented the display driver will be in terms of SteamVR support. The lack of ASW (as in spacewarp, rather than ATW) in SteamVR versus what the Rift already has might be an issue. Because DCS drops down to 45 fps under high res and load, getting it to run smoothly via this interim SteamVR driver will be make or break for me.



Agreed. I know I’ll appreciate the FOV increase, but the higher resolution is what I’m really looking forward to. Having to use labels and the VR Zoom feature is an immersion killer for me, but currently a necessary evil.


Very interesting, looking forward to some practical tests. If this is a noticeable improvement over the Rift, I am willing to upgrade to Samsung. Oculus really needs to step up with the CV2 (and I hope it will not all be about being wireless, because I have zero interest in roomscale VR).


A review already. I’ll fill in with notes a bit later to save people having to watch, or something :slight_smile:


So this video is more about Microsoft ‘Mixed Reality’ rather than the Samsung device explicitly. They don’t really talk about the resolution increase that much, more just Microsoft’s odd strategy of only allowing their platform/store with their own devices. The no-external-sensors tracking looks convenient and decent enough, but what I’d really like is a more racing or sim focused review just on the Samsung HMD. As they are out soon (November-ish), there’s not long to wait.


Thanks - I’ll have to watch that over breakfast tomorrow. It seems to me, so far, that I will be using my Rift CV1 for a few more years yet. To justify the outlay (for which I had to cancel my bike’s 48k servicing!) I need to get more than 18 months or even two years of use out of it - and secondly, all the new stuff that would seem to improve on what I have in my CV1 (if projections are accurate) would mean buying a new 1080 as a minimum as well - and here in the UK those things are very expensive, not to mention the fact that I’ve bought 3 top-end graphics cards in 2 years, whereas normally one would last me for 3.