
Pimax 8k / 5k VR headsets coming to Kickstarter this month!



More and more information keeps coming out about the new Pimax, and to me it still looks very promising. Of course the final truth will be revealed when the devices reach customers, but so far it's looking good. Very interesting times in VR technology.


There is way more to building a good HMD than increasing resolution and FOV.

Oculus dumped billions on the Rift and that hasn't pushed the CV1 beyond 110 degrees and 1080 * 1200 per eye.

I really hope the new Pimax is a quantum leap forward but I'm not holding my breath. If you are at the Kickstarter stage then you aren't even remotely close to the production quality I am looking for - more like a prototype with all the attendant issues.

My 1080Ti still can't run this title at what I previously considered minimum acceptable levels of detail. It's only the low resolution that lets it run at all.

I have tried SLI 1080Tis but it actually ran worse. Nvidia have their own (sadly proprietary) VR SLI solution called VRWorks, requiring a 10-series card as a minimum, where each eye is rendered on a discrete GPU. But you need DX12 to even use it, and then the developer must actively support it. There is no equivalent functionality in OpenVR AFAIK, though DX12 does have limited support for non-SLI multi-GPU acceleration for VR.

 

So even if the Pimax 8K is everything we wish for, you can't buy a GPU configuration that will run BoX on it.

 

Caveat: because most of us use supersampling anyway, it may be possible to skip supersampling and simply render to native screen-sized framebuffers, but this doesn't address the issue that you basically have to render the scene twice (once per frustum) every frame. So until BoX uses DX12 or native VR SLI APIs like VRWorks, I doubt the experience on Pimax will be any improvement over what we have with the Vive and CV1.

 

Note that the real issue with the Rift (at least for my taste) is not its smaller FOV - it is the pixels per radian. It is the painfully obvious gaps between pixels that create the eye-bleeding screen door. In this respect the Pimax 5K promises no improvement. Where I expect Pimax to really fall short, though, is ergonomics. It is clear when you examine the CV1 that a lot of thought and iteration has gone into the Rift. The tracking is close to perfect and the comfort, fit and finish are exceptional. Even with its level of refinement, however, it fails to support 100% of users ergonomically. Minimum inter-pupillary distance for the Rift is about 58.5mm (give or take) and the Vive is about 60mm. For plenty of people that is just a bee-pube too wide. Close enough for short periods, but bad for your eyesight in extended play. Another issue with the CV1 is that there is no adjustable eye relief. For me this means my eyelashes graze the lenses, which is very annoying. The Vive was better in this respect but dimmer in the central region of the displays than the Rift.

 

I have more faith in an earlier-than-anticipated release of an incrementally better CV2 than an earth-shattering Pimax. Let's hope Oculus and HTC at least feel some pressure to accelerate their schedules.

 

 

Edit: Where the VR leaders are focusing their attention at the moment is pupil tracking and foveated rendering. Oculus plans to implement this in CV2, and HTC are also working on it. Pimax don't appear to be; neither are Starbreeze. This adds a whole new level of packaging complexity - which is part of the reason for the long wait for CV2 et al.

Edited by Dave

SvAF/F16_radek


 

Dave, while traditionally GPUs have been the bottleneck in games, BoX in VR is an exception. It's very CPU-dependent and, more importantly, single-thread CPU-dependent. This is why so many of us can run generous amounts of supersampling: SS taxes the GPU far more than the CPU. BoX just doesn't take advantage of the 4 or more cores most of us have (in Task Manager it will appear as if all your cores are only at 25-30% load each). If you skim through the benchmarks thread you will see Chiliwili's graph showing a strong correlation between single-thread CPU rating and fps, while GPUs don't affect the results as much as one would think, as long as they are 10-series or the later 9-series cards.

 

The increased FOV from the Pimax will definitely tax our CPUs in BoX more than our current 110-degree FOV does, and the resolution will too, though not to the same extent. I would suspect a 1080 to be enough for IL-2 and the Pimax, but it will be a slideshow due to the game's dependency on single-thread CPU performance. Hoping that once the Pimax is out and delivers all its promises, the IL-2 devs will also find a way to make BoX properly multithreaded and take advantage of the multi-core CPUs we already have.

Edited by a_radek

I'm glad that all this speculation can be put to rest in a few months.

 

In any case, it's great to have someone trying to improve the poor resolution and SDE (screen-door effect) of the current headsets, regardless of how current GPUs can handle it in IL-2. Things are still moving forward in VR gaming.


SCG_Fenris_Wolf

After seeing VRWorks first hand, I must say the techs Nvidia delivers are fantastic in both quality and performance. IL-2 really needs this DX12 upgrade. Only a few techs from VRWorks are compatible with DX11, albeit including the more powerful performance-saving ones like LMS. It's maddening that AMD missed that train (yet again) and is left far behind in the dust. Competition would be great to push development faster.

 

The Pimax 5K/8K won't run fluidly on IL-2's current engine anyway. They will have to run with these new technologies or we'll receive a hi-res VR stutter show. Well, we can still play on low or balanced hi-res at 90Hz with that fancy GTX 1080, but Super Mario 64 graphics defeat the purpose, don't they ;)

Edited by 2./JG51_Fenris_Wolf

 

 


 

 

Edit: Where the VR leaders are focusing their attention at the moment is pupil tracking and foveated rendering. Oculus plans to implement this in CV2, and HTC are also working on it. Pimax don't appear to be; neither are Starbreeze. This adds a whole new level of packaging complexity - which is part of the reason for the long wait for CV2 et al.

 

Indeed, that will be a game changer for VR, no doubt, and we know that Oculus is working on it.

 

In any event, I am still hanging my hat on Oculus for now and plan on staying with them going forward, but I will certainly be watching for reports on the actual consumer devices Pimax comes out with over the next several months.

 

One thing for sure is that both Oculus and HTC are apparently keeping their plans quiet at this time. Not sure if that is a good thing for them or not, but they don't seem too concerned, at least for now.

 

Competition is certainly a good thing; we will all benefit from it in both technology and pricing.


=EXPEND=Tripwire

Can we not get BOX to run on more threads/cores?

 

We can't, no - but the developers *might* be able to if they can commit a significant amount of time and resources to the task.

It is no simple feat and should not be underestimated.

I'm waiting to see if one of the other manufacturers releases a higher-resolution screen combined with foveated rendering. Without that technology in place, the GPU power required to run the higher resolutions would be prohibitive.

Edited by =TBAS=Tripwire

SCG_Fenris_Wolf

But guys, isn't that foveated rendering based on eye tracking enabled through the DirectX 12 API, similar to VRWorks?

 

So whatever may be released, IL-2 will need DX12 anyway, or did I get that wrong?


So in the end, DX12 would be necessary anyway. I recommend you read this: https://developer.nvidia.com/vrworks/graphics/singlepassstereo

It allows rendering the scene once, not twice. Hence performance requirements closer to monitor-style gaming, and the doubled CPU load will be a thing of the past. You just need a single instance to be fed to the GPU!

 

 

 

P.S. It works with DX11 already.

Edited by 2./JG51_Fenris_Wolf

 


 

I am well aware of BoX's poor CPU parallelism, but that really has little to do with the cost of rendering to two 4K displays instead of a single 2K display. Supersampling is not something we do just because we can; it is required to get acceptable subpixel-like smoothing on low-resolution VR displays, and it works because downsampling works in our favour on the Vive and Rift DSCs and because our GPUs are capable of rendering most scenes at 4K resolution.

SS does fcuk all on the CPU, actually. It increases load slightly on the compositor, which does use CPU time, but it drastically increases the framebuffer size, taxing the GPU in every phase of the render pipeline during and after rasterization (e.g. fragment and pixel shaders). It also increases memory bandwidth and transfer bandwidth demands on the GPU.

This game is absolutely GPU-limited in VR - largely due to the additional frame presentation timing overheads of outputting to VR displays (Apple had a very good presentation on the topic at WWDC17 on the Metal 2 track). I currently play at a lower (supersampled) resolution on my Rift than I was previously playing on my 2560x1440 screens and it still runs like shit. That's with the graphics preset on High and all options set to Medium; on my screens I was running Ultra with every option maxed. CPU utilisation hasn't changed at all.

Honestly, I don't care how hard the Pimax will push a GPU, because I have another three 1080Tis sitting on my desk. The Pimax won't be the issue; the game will, because it fails to optimally use two cards in SLI and fails to take advantage of vendor-specific VR support. The single-threading is problematic, but not for me, because even my single-thread CPU performance is more than enough for this game. It would be nice if the game offloaded physics processing to my other GPU too - but it doesn't. It would be nice if the game used OpenCL or CUDA to accelerate parallelizable tasks on the GPU - but it doesn't. Not bashing the game - just telling you how it is.

 

Having just spent a few hours digesting everything I can find on the web about the Pimax 8K, I have to say that some of my reservations about it were due to the shortcomings of the rather underwhelming Pimax 4K, and the 8K seems to have actually addressed all of them. I do like the modular architecture and the development potential it opens up. I will probably buy one anyway just to see for myself - and maybe to integrate it with my Leap Motion. It's not truly 8K, but it is still a much higher pixel density than my Rift. Hopefully BoX will improve to take better advantage of my GPU investment before December rolls around.

 

Edit: I forgot to mention that the Pimax 8K being on Kickstarter really doesn't indicate immaturity as I suggested in an earlier post. The goal was a paltry $200K, which wouldn't come close to paying for the R&D of an HMD, and was more than likely just a PR play to get the internet talking about their product - which has apparently already been through one production run using Lighthouse V1 tracking hardware. I have read that devs and integrators can even get their hands on some limited stock of the Lighthouse V1-based units, which are already warehoused but not going to market since Valve has already shipped V2.

Also, the extensible design of the Pimax 8K does already provide for pupil-tracking modules, which they have demonstrated, contrary to my earlier assertion. This will give software developers another out in the form of foveated rendering, drastically improving perceived fidelity in the foveal region for the same or lower GPU load.

Edited by Dave

Can we not get BOX to run on more threads/cores?

One major hurdle here is that C++ only gained explicit language and library support for multithreading in the C++11 standard. I suspect (and this is mostly my own speculation, given the age of the RoF engine and the development team) that BoX is implemented in C++98 and would need a wholesale rewrite to properly support multithreading.

 

The difficulty of writing, and more importantly testing, concurrent code cannot be overstated. After doing it myself for over 25 years I still have to be very careful, and hurt my brain a lot, when implementing thread-safe and performant data structures and algorithms - which is why smart developers don't: we use thoroughly tested and proven language support and concurrency libraries. These things are relatively new in C++ (the implementation language of most performance-critical games).


SCG_Fenris_Wolf

Dave, I thought that IL-2 does not just render the scene twice, but also feeds the GPU twice with the geometry and everything else - basically two instances.

 

With Nvidia's VRWorks, there seems to be an easy way to dodge that and only run through everything once using Single Pass Stereo (it's fully implementable in DX11, no need for DX12). The API will split the scene up and cast it onto both eyes. That way you achieve FPS in the region of single-monitor performance; all the VR users here would basically double their FPS and we'd be back to monitor performance. I have seen the transition to the VRWorks tech in EVE: Valkyrie - the results are amazing, the game felt like it jumped a generation ahead.

 

I think it's the only way right now. And Ryzen users profit from it as well; they won't require >4 GHz anymore, as long as they use an Nvidia GPU (let's be honest and sell that shitty no-VR-tech Radeon walmart card; they are not made for VR, but sell like hot bagels due to bitcoin miners right now).

 

Rewriting the whole code/engine for different multithreading isn't something the developers would actually do anyway.

Edited by 2./JG51_Fenris_Wolf

@ Dave,

mate, really, stop posting stuff you don't know about. Reading your FUD on the topic of VR and Pimax is quite sad, given the amount of information you post with little actual factual information as to what is being done by these guys with their HMD, or the issues with VR in general. :rtfm:

 

https://www.kickstarter.com/projects/pimax8kvr/pimax-the-worlds-first-8k-vr-headset/comments

 

http://forum.pimaxvr.com/c/pimax-8k

 

Two links for you; now go and read up on the information posted there and what the developers are doing. Until then, please refrain from commenting on a product of which you have little knowledge.

 

As for others here, why promote Nvidia VRWorks when we all know it will be locked down to Nvidia-based products and half the people here will miss out on the supposed benefits? There are other technologies available in VR that are open and multi-platform, which is better for us as customers and enjoyers of BoS. Nvidia has a history of proprietary features that ensure lock-in to their products, G-Sync being their latest, which places supported monitors $300 to $400 above similarly featured FreeSync (open tech) monitors, and G-Sync is usable only with Nvidia GPUs.

 

As for multithreading the BoS series so that rendering is not single-thread CPU-limited, there are two options available: DX12 or Vulkan. In my experience Vulkan has delivered better benefits than DX12, but then most DX12-enabled titles have been tweaked DX11 titles, and to get the real benefit of DX12 the graphics stack of the title needs to be designed with DX12 in mind. With Vulkan titles this is most likely the case, and as a result the results with Vulkan-supported titles have been very good.

 

With the IL-2 BoS series, the graphics stack would need a rewrite, and I am not sure why they didn't take DX12 into account when they did the move from DX10 to DX11. Still, the 1C developers have done quite an impressive job with what they can milk out of DX11 and its single-threaded CPU limitation. They can still utilise multiple CPU cores by moving other elements of the game to other cores, like AI and damage modelling / physics. I am not sure how much of this has been done already.


SCG_Fenris_Wolf

blitze, what you wrote regarding the Pimax is right.

 

What you wrote regarding VRWorks is false. You must consider that only the smallest fraction of users who play in VR are using Radeon graphics cards - we are talking about AMD graphics cards here; the Ryzen CPU is fine. We should recommend those guys sell their AMD cards and buy Nvidia, because Nvidia drives API technology development for VR. AMD doesn't. At all. AMD's market strategy here is cost leadership. They are not able to, and do not intend to, capture the high-end market, but serve those who want to save a few bucks while getting good performance, and who aren't looking for the latest technologies in drivers, APIs, etc., for classic monitor gaming. All they did was slap "VR ready" and "Liquid AMD" stickers on their cards and on PC cases.

 

The market is (fortunately!) not socialistic. If a company misses out on new technology, misses that train, their segment decreases. That tells them to improve their technology, or perish, or go bottom-of-the-market cost leadership, or diversify into other fields (currency mining).

 

Going VR means that you roll with the best technology. AMD isn't even close to that; they have nothing to show for it. They roll classic: low-cost, simple, low price for power. But technology isn't their horse. Those that are left behind in the dust are left behind in the dust - at least in this VR market segment.

 

Easy as that.

Edited by 2./JG51_Fenris_Wolf


 

You went off half-cocked - a bit like I did. My earlier judgement of Pimax was based on the 4K, which I tested - it was rubbish. I corrected these earlier misunderstandings of the Pimax 8K in post #92 after spending a few straight hours reading every detail available on it. Thanks for the condescension - the link suggestion would have been fine without the supercilious remark.

 


Now you're into territory of which you have patchy knowledge. Only the draw calls in DX11 are single-threaded. Here is an intro to multithreading in DX11 if you are interested. Personally I'm not a fan of DirectX at all. Vulkan is my preference too, but you will not see BoX running on Vulkan any time soon. DirectX is a much higher-level API than Vulkan, which sits even lower than OpenGL did. The work to port to Vulkan would be immense, but it would open the possibility of Linux and Mac ports. Vulkan, Metal 2 and DX12 all allow parallel submission of geometry to the GPU. This is a bonus for VR, as it almost halves CPU involvement in presenting frames to the display, but it doesn't fix an application under-utilising the CPU's parallelism. While Apple-specific, this video describes this and other VR optimisations in Metal 2 (some of which I think were borrowed from Vulkan). Incidentally, Valve are development partners on the Metal 2 framework, and learning about it offers insight into modern rendering techniques targeting VR generally.

 

FWIW, I don't think AMD are significantly behind on VR support in their GPUs. They worked very closely with Valve and Apple (who actually are behind) on VR support for macOS, and the Radeon Vega 64 kicks the shit out of my 1080Ti in raw processing power. Vega provides hardware support for single-pass stereo, multi-res shading and VR across multiple GPUs. Aside from enabling architecture on the GPU, most of this stuff is in software anyway, and Vulkan is very well supported on Radeon. Stuff like LMS (well, at least lens-matched scene masking) is already in SteamVR. Is the Vega 64 the equal of a 1080Ti in VR? Not quite, but the difference would not be enough to make me sell it and buy Nvidia. I bought the 1080Ti mostly because I have only bought Nvidia for years and felt that game support for Nvidia was very slightly better. I pay for that choice when dual-booting to macOS, though.

Edited by Dave

AFAIK the events are open, so anybody with a negative review could easily make themselves heard.

 

 

 

One is a review, the other a fact-sheet analysis by a renowned fan of Oculus. His analysis is interesting, but take it with a grain of salt.

 

For example, did you know that the subpixel arrangement in the Rift is such that blue and red subpixels are shared between pixels? This means the 1200 lines are actually only 600 lines if you look at just the non-green components. That sounds worse than it is in practice, though. In these matters experience is important; you can't just look at the numbers.

 

I have fears when it comes to usefulness for IL-2. It will probably require code changes to work with the two non-planar projections of the 8K, and the wider FOV may demand more than CPUs can provide at the moment. They are already on their knees at 110 degrees; 200 is almost twice that.

I would like to say that I have run IL-2 at 11520x2160 at a pretty steady 60fps online. Not sure what that FOV equates to in degrees, but at least it gives hope for those with good video cards. I am doing this with two 980Tis, so a 1080Ti may do the trick for our game. If we could get SLI to work in VR, that would be the golden arrow for VR flight sims, in my opinion.


I would also like to add that, with the exception of 6DOF, the Pimax 4K updates have actually made it run rock solid as of late. Like I said, until Nolo and Pimax fix their issues I do not have 6DOF, but it is not a game killer by any stretch. It's slightly awkward when you lean forward and the view doesn't change, but it makes up for it with a steady frame rate and crisp picture. The Pimax 4K has come a long way and I am looking forward to what the 8K will offer, especially with the 6DOF; I cannot see anything better from the other proven companies. Just my 2 cents, as the 4K seems to get a bad rap sometimes and I find it pretty good.


SvAF/F16_radek

This game is absolutely GPU limited in VR - largely due to the additional frame presentation timing overheads of outputting to VR displays.

 

 

..The single threading is problematic but not for me because even my single threaded CPU performance is more than enough for this game. 

In my previous post I stated it's our single-thread CPU performance that is currently limiting us in BoX, and while trying to understand, I still don't know why you keep insisting it's not the CPU.

 

https://docs.google.com/spreadsheets/d/1gJmnz_nVxI6_dG_UYNCCpZVK2-f8NBy-y1gia77Hu_k/edit?usp=sharing

 

A glance through this spreadsheet (despite the limited number of benchmarks done) does reveal that slower single-thread CPUs paired with top-end GPUs lag far behind.


What an utter load of Nvidia-biased nonsense; why don't you think before you post? AMD are not significantly behind Nvidia in VR-related technology for their GPUs (I've used GPUs from both AMD and Nvidia for VR). AMD, along with Nvidia, are very much at the forefront of pushing GPU technology in VR.

 

This review of the RX 480 is a bit old, but it clearly shows that even lower-end AMD GPUs are more than capable of delivering good VR performance.

https://uploadvr.com/amd-radeon-rx-480-review-vr-ready/

 

You state that "only the smallest of percentages of people use AMD for VR" - what exactly is the smallest of percentages? Is it 1%, or 0.0000000001%, or even less? Or perhaps you are just making numbers up out of thin air?

 

AMD have ~25% of discrete GPU share (not market share of new sales, but actual owners). Let's assume 100,000 people bought a piece of software: would you deliberately alienate 25,000 of them? A quarter of existing (and potential future) BoX owners is anything but the smallest fraction. I imagine if your wages were 25% short you would not shrug your shoulders and say "sure, it's only the smallest fraction".

 

No developer, or indeed any business, would deliberately p**s off 25% of their potential customer base. "Sorry, all BoX users planning to use VR, it's for Nvidia users only; if you have an AMD GPU it's your own fault." Yep, that would go down really well with ~25% of us here.

 

 

I'm using a 580 and the results are not bad at all!



I can't wait!


To the best of my understanding, Pimax is compatible with SteamVR, and BOX VR runs through that, so in theory BOX is compatible. Whether there are any distortions remains to be seen when someone actually tries BOX with a Pimax 5K/8K.


Any sim that runs through SteamVR should run on the Pimax 5K and 8K as well. Pimax has already demoed DCS World on the 8K, so that one for sure.


SCG_Fenris_Wolf

Actually, I am biased towards 3dfx. But since those guys missed the tech train and succumbed, I have been biased towards new technologies — the kind that let us tame IL-2's excessive hardware hunger in VR, especially on the CPU.

 

ICDP, I know you are an AMD fanboy, and you can explode all you want, but it won't change the fact that AMD is not bringing comparable technology to the table. They bring raw power that can compete with Nvidia's GPUs, but no equivalents of features like Single Pass Stereo and Lens Matched Shading. If AMD were able to deliver the same, that'd be great.

 

Have you tried EVE: Valkyrie on a GTX 10x0? Answer me that one at least, instead of hurling insults like "do you even think before you post", devoid of arguments, which is embarrassing only to yourself.

Do you see how much better it looks than IL-2 (and other titles as well), thanks to Nvidia's VRWorks API, while putting much lower demand on your hardware?
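For what it's worth, the core idea behind Single Pass Stereo can be sketched in a few lines. This is illustrative Python only, not the actual VRWorks API, and the 5,000-object scene is a made-up figure:

```python
# Illustrative sketch (NOT the VRWorks API): why Single Pass Stereo lowers
# demand. Naive stereo rendering submits every object once per eye; with
# single-pass stereo each object is submitted once and the GPU replicates
# the geometry to both eye viewports, roughly halving submission work.

def naive_stereo_submissions(num_objects: int) -> int:
    """One submission per object, per eye (left + right)."""
    return num_objects * 2

def single_pass_submissions(num_objects: int) -> int:
    """One submission per object; both eyes handled in a single pass."""
    return num_objects

scene = 5000  # hypothetical object count for a busy flight-sim frame
print(naive_stereo_submissions(scene))   # 10000
print(single_pass_submissions(scene))    # 5000
```

That per-frame submission work lands largely on the CPU, which is exactly where IL-2 hurts in VR.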

 

 

P.S. I am a fanboy of IL-2 in VR, and I want it to be better. And I am a fanboy of technologies. I couldn't care less which graphics-card company's name is on the box.

Edited by 2./JG51_Fenris_Wolf

Which flight Sims are compatible with Pimax 5k and 8k?

 

As Black_Sab and Bobflex said, any HMD that supports SteamVR will work with all SteamVR-compatible software. For sims:

 

BOX

DCS

Assetto Corsa

Project Cars (1 and 2)

Elite Dangerous

 

Among others.


Don’t resist, come to our side :P

Help us get that juicy 3M stretch goal :P



Don’t resist, come to our side :P

I am really tempted, but there's a lot of money involved and I am not sure the 8K would be best for flying sims. From what I understand, both the 5K and 8K take the same native input resolution, but the 8K has higher-resolution panels and a scaler. I wonder what that will do to spotting. I don't doubt that for general gaming the 8K would look nicer, but if you take one pixel, run it through the scaler and display it on a screen with higher DPI, you might be at a disadvantage. I think it would be prudent to wait for reviews of the Pimax 5K and 8K and how they work with IL-2.
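As a back-of-the-envelope sketch of that worry, the per-eye figures below are assumptions based on the commonly quoted specs (a 2560x1440 input signal upscaled to a 3840x2160 panel), not confirmed numbers:

```python
# Rough arithmetic only; these resolutions are assumptions, not confirmed specs.
input_w, input_h = 2560, 1440    # per-eye input signal (assumed)
panel_w, panel_h = 3840, 2160    # per-eye panel resolution (assumed)

scale = panel_w / input_w        # 1.5x upscale per axis
# A distant contact rendered as a single input pixel covers `scale` panel
# pixels per axis after upscaling, so the scaler spreads its brightness
# across neighbouring panel pixels instead of lighting one crisp dot.
dot_area = scale * scale         # panel-pixel area for a 1-px contact

print(scale)     # 1.5
print(dot_area)  # 2.25
```

A non-integer scale factor means a one-pixel dot can never map cleanly onto the panel grid, which is why it may end up dimmer and blurrier than on a native-resolution display.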


SCG_Fenris_Wolf

While it is very tempting: after receiving a dead GPU (thanks, Alternate), two malfunctioning 34" monitors (replaced by Amazon), a non-working lighthouse station (sent the whole lot back), a broken Warthog HOTAS, and a faulty Rift (gyro temperature damage after 4 weeks, RMA), all within 18 months, I am not very inclined to donate to a company.

 

I will wait for the Pimax 8K to go on sale in the EU from an EU-based vendor, and I recommend everyone do the same. If anything breaks or arrives effed up, what are you going to do?

When you donate to a company instead of buying a product, you have no legal leverage whatsoever.

 

Before anyone mentions it, yes, I did throw salt over my shoulder. It didn't help :D


I understand it’s sort of a gamble, but then again, they are putting a lot of freebies in the Kickstarter pack; soon will come a free eye tracker, and if it works as it’s supposed to, it will be a great addition to any game.

Moreover, they stated that they will send the customer version, not some mass-produced prototype, so that is a plus. And it probably won’t be on the retail market until at least June or July, maybe even later, so that’s another plus for backing now.

 

I confess that I’ve had my doubts, but the reviews have been really good overall, so I’ve decided to go forward with it.

And warranty is indeed a concern, but they have stated that it will be one year, shipped from EU warehouses, so that somewhat softens the fears...

 

On another note, the FOV might be what’s lacking for VR in IL-2. How hard and painful it is to check six with the Rift... I flew my first multiplayer sorties today on WoL, and it was bad! I’m a newbie and have had some success against the AI, but online, flying alone, I got bounced so often without any proper way to see behind me that it seriously frustrated me! Pimax might ease that!


SCG_Fenris_Wolf

Is there a way to buy a single lighthouse station for positional tracking, I wonder?

 

I don't need controllers or anything else, I'd just use this for flying.


Is there a way to buy a single lighthouse station for positional tracking, I wonder?

 

I don't need controllers or anything else, I'd just use this for flying.

Yes, you can do what I did and add $75 to the pledge. 3DoF out of the box and 6DoF with one base station.


Yes, you can do what I did and add $75 to the pledge. 3DoF out of the box and 6DoF with one base station.

 

Yep, that's what I did too. But you have to add an additional $10 for shipping. ;) You will get a survey from the developers at the end of the campaign, where you can choose what to do with your additional pledge amount. From their Q&A section:

Can I add an additional controller/base station in my current package?

 

Yes, simply add the additional amount to your pledge:

base station: $75/each + $10 for shipping

hand controller: $100/each + $10 for shipping
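For anyone tallying their pledge, those quoted prices work out like this. A trivial sketch using only the figures above; the example counts are hypothetical:

```python
# Add-on pledge calculator from the Q&A prices quoted above:
# base station $75 + $10 shipping, hand controller $100 + $10 shipping.

def addon_total(base_stations: int = 0, controllers: int = 0) -> int:
    """Extra pledge amount in USD for tracking add-ons."""
    return base_stations * (75 + 10) + controllers * (100 + 10)

print(addon_total(base_stations=1))                 # 85 (the single-station 6DoF setup)
print(addon_total(base_stations=2, controllers=2))  # 390 (a full roomscale kit)
```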

