
Pimax 8k / 5k VR headsets coming to Kickstarter this month!


I will not pay serious attention to any of the hand-selected Kickstarter backers posting their thoughts/reviews on these early Pimax units.

I prefer to wait until many more "production" headsets are in the wild and I see feedback from regular users on various forums.


Looking at that latest SweViver DCS video, I can say the fps he is getting (on a top-of-the-line rig) would not be acceptable for me in IL-2.

 

Can any DCS VR player tell me how DCS compares to IL-2 in performance? Is it equally CPU-bound? One main thread? And are the settings SweViver is using the DCS equivalent of our "low"? It looked like very low-resolution ground textures and no shadows at all, either in the cockpit or on the terrain.

 

1 hour ago, SvAF/F16_radek said:

Looking at that latest SweViver DCS video, I can say the fps he is getting (on a top-of-the-line rig) would not be acceptable for me in IL-2.

 

Can any DCS VR player tell me how DCS compares to IL-2 in performance? Is it equally CPU-bound? One main thread? And are the settings SweViver is using the DCS equivalent of our "low"? It looked like very low-resolution ground textures and no shadows at all, either in the cockpit or on the terrain.

 

 

Using fpsVR (a must-buy at £3), I'm seeing the same as SweViver in DCS; both CPU and GPU are only running at 50% utilisation. In IL-2 the GPU runs at 100% utilisation and the CPU at around 50%. My rig is an Odyssey (SS 150%), an i5-4690K OC'd to 4 GHz, 8 GB DDR3 and a GTX 1080.

 

There's something wrong with DCS, as it's not really stretching either the GPU or the CPU. IL-2 and DCS both run at about 40-50 fps, but DCS stutters a lot more... IL-2 is a lot smoother, and this makes it the better VR platform at the moment, IMHO.

 

Any idea when the GTX 3080 Ti will be released? We need better and faster graphics cards; the RTX 2080 Ti just doesn't seem a worthwhile upgrade...

Edited by 1./KG4_ReggiePerrin


DCS players are saying there has been a massive performance drop since 2.5. They were able to get 90 fps before (I was too, with a Rift and a 980 Ti, when I tried the demo half a year ago) and now they are just trying to keep it close to 45 fps.

 

IL-2 will be benchmarked by SweViver too.


Thanks guys, that gives a somewhat clearer picture of how DCS and IL-2 compare.
 

32 minutes ago, neelrocker said:

IL-2 will be benchmarked by SweViver too.

Very much looking forward to that.


IL-2 has always run at a much better frame rate than DCS. As well as better fps, it looks so much nicer. If you want good fps in DCS, look only at the sky. But then you will be confronted with billboard clouds, which plainly look silly.

 

My 2080 Ti is out for delivery, so I will be testing this tonight.

16 minutes ago, VBF-12_Stick-95 said:

 

Nice!  If you don't mind me asking, which brand did you go with?

I got the Founders Edition. I've got a mITX case, so this with water cooling should be nice.


17 hours ago, scrapmetal said:

I got the Founders Edition. I've got a mITX case, so this with water cooling should be nice.

Says the guy calling himself "scrapmetal".

On 10/4/2018 at 9:45 AM, 1./KG4_ReggiePerrin said:

Using fpsVR (a must-buy at £3), I'm seeing the same as SweViver in DCS; both CPU and GPU are only running at 50% utilisation. In IL-2 the GPU runs at 100% utilisation and the CPU at around 50%.

 

I think that SweViver, despite running an 8700K (I don't know at what frequency), is also limited by the CPU. And the CPU % load number is meaningless, as usual.

There is another number displayed, called CPUmax/thread, and I don't know what it means.

I have to try that fpsVR tool.

On 10/3/2018 at 9:00 PM, dburne said:

I will not pay serious attention to any of the hand-selected Kickstarter backers posting their thoughts/reviews on these early Pimax units.

I prefer to wait until many more "production" headsets are in the wild and I see feedback from regular users on various forums.

 

Fully agreed. It is true that these YouTubers are always quite enthusiastic. It is always good to read all the reviews, from the hypercritical Spanish reviewers to the somewhat biased Tom's.

What really surprises me about Tom's is that they consider the Vive Pro superior to the Rift! I had the Vive Pro and I can tell you that it is not better than the Rift overall (my review is in this forum). Tom's doesn't even mention the Vive Pro's small sweet spot!

Tom's review said that Pimax is not bringing any technical advance (like eye tracking + foveated rendering, varifocal optics, etc.). Yes, that's true, but guys, they deliver exactly what most of us want right now (FOV and resolution).

Tom's lives off its Amazon links, and Pimax is not on Amazon, so they prefer to steer attention toward products that are.

In any case, all reviews have to be taken with caution, and the GPU problem for Pimax is well known, so it is good not to set our expectations too high.

There were two reasons for me to become a Pimax backer:

1. To help companies that deliver what I want now (FOV and resolution). I don't want varifocal stuff yet.

2. To test the device myself. If it is better than the Rift I will keep it; if not, I will sell it.

Edited by chiliwili69


Where is my 2080 Ti, and why did they send me scrapmetal instead? 🤣 This piece of junk can barely manage an overclock that is stable enough for 3DMark. It is only a few percentage points faster than my overclocked 1080 Ti. I don't hold much hope of it providing any benefit over a 1080 Ti, but I'm still playing around with it and haven't compared in-game fps yet.

[edit] Well, VRMark tells a different story. Lots of gain, but I don't know yet whether that translates to a real-world fps gain.

Edited by scrapmetal


So basically, unless we get massively increased CPU speeds in the near future, this thing is basically DOA. I've been playing Escape from Tarkov for the last week, just fed up with the performance in VR even with a monster rig. I know IL-2 wasn't built for VR, but having nearly the best hardware money can buy and still getting poor performance is tiring. We need much faster CPUs, or IL-2 needs to be completely rewritten to take full advantage of multicore CPUs. Jumping around 1-4 threads and using 50% of those cores isn't good enough. So unless 7nm is a massive leap, I think I will just give up on VR.



I think it depends on what you're looking for in the game. The immersion of playing in VR is incredible, but I suspect that for the absolute best competitive play you want a 1440p display with head tracking. VR can be an advantage in a dogfight, but for spotting and ease of checking six, VR lags behind a lot.


The various posters who state that CPU utilisation figures are incorrect and misleading are correct (that is, the people are right that the utilisation figure is wrong... I think this is worded better. :) )

 

This is worth a read: http://www.brendangregg.com/blog/2017-05-09/cpu-utilization-is-wrong.htm
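
To make the article's point concrete, here is a minimal C++ sketch (my own illustration, not from the article): a pointer chase through a large shuffled array. The OS will report the core as ~100% "utilised" for the whole run, yet most of those cycles are really stalls waiting on DRAM.

// busy_but_stalled.cpp -- a core can read "100% utilised" while it is
// mostly waiting on memory. Build: g++ -O2 busy_but_stalled.cpp
#include <chrono>
#include <cstdio>
#include <numeric>
#include <random>
#include <utility>
#include <vector>

int main() {
    const size_t N = 1u << 26;                   // ~512 MB of indices, far bigger than any L3
    std::vector<size_t> next(N);
    std::iota(next.begin(), next.end(), 0);
    std::mt19937_64 rng{42};
    for (size_t k = N - 1; k > 0; --k) {         // Sattolo's algorithm: one big cycle,
        std::uniform_int_distribution<size_t> d(0, k - 1);
        std::swap(next[k], next[d(rng)]);        // so the chase visits every slot
    }

    size_t i = 0;
    auto t0 = std::chrono::steady_clock::now();
    for (size_t hop = 0; hop < N; ++hop)
        i = next[i];                             // each dependent load is ~a cache miss
    auto t1 = std::chrono::steady_clock::now();

    std::printf("avg %.1f ns per load (end=%zu)\n",
                std::chrono::duration<double, std::nano>(t1 - t0).count() / N, i);
    // Task Manager / fpsVR counted this thread as 100% busy throughout,
    // even though the core sat idle for most of each ~100 ns memory stall.
}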

 

We seem to be fixing a software optimisation problem with brute force; me waiting for the GTX 3080 Ti, others looking for CPUs that can work out the ultimate question (the answer is 42).

 

When the software and hardware get in sync, VR is going to be even more incredible. I think I'll be going for the Pimax 5K+.


Just a consequence of the recent hacks.

 

" Thanks for your reminding, the Pimax main site is under fix now, the website recently got a lot of attacks and we plan to rebuild the website architecture for better security. we will update a temporary web page by today. "

10 hours ago, 15th_JonRedcorn said:

We need much faster CPUs, or IL-2 needs to be completely rewritten to take full advantage of multicore CPUs. Jumping around 1-4 threads and using 50% of those cores isn't good enough.

 

Same with X-Plane 11. It favours clock speed (Hz) over core count. It must be a mountain of work to get these apps to make greater use of the available cores.

Edited by TP_Merlin

4 hours ago, 1./KG4_ReggiePerrin said:

 

The link is not working; hope this one does: http://www.brendangregg.com/blog/2017-05-09/cpu-utilization-is-wrong.html

Keep in mind, it looks like this is a Linux blog.

Edit: Hmm, the missing 'l' from 'html' 😉

Edited by Dutch2

10 hours ago, Alonzo said:

VR can be an advantage in a dogfight, but for spotting and ease of checking six, VR lags behind a lot.

 

Spotting in VR is a lot easier thanks to bigger dots. IDing planes is the problem.


20 hours ago, 1./KG4_ReggiePerrin said:

The various posters who state that CPU utilisation figures are incorrect and misleading are correct (that is, the people are right that the utilisation figure is wrong... I think this is worded better. :) )

 

This is worth a read: http://www.brendangregg.com/blog/2017-05-09/cpu-utilization-is-wrong.htm

 

This article states that external memory (DDR3/4 in our case) access time is included in total CPU busy time. This is true on any architecture, not only on x86/amd64. And this happens at a lower level than the OS, so it does not matter whether you run Linux, Windows, or bare metal.

 

We already know that external memory speed affects FPS in IL-2, which means that a significant part of CPU busy time is spent stalled on memory access. In theory, a larger cache means fewer external memory accesses. In practice, large caches are only available on high-core-count CPUs, and those CPUs have low clock frequencies.
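
If anyone wants to see that cliff on their own machine, here is a rough C++ sketch (the same pointer-chase trick as the earlier example, swept across working-set sizes); expect ns/load to jump each time the buffer outgrows L1, then L2, then L3:

// cache_cliff.cpp -- dependent-load latency vs. working-set size.
// Build: g++ -O2 cache_cliff.cpp
#include <chrono>
#include <cstdio>
#include <numeric>
#include <random>
#include <utility>
#include <vector>

static double ns_per_load(size_t n) {
    std::vector<size_t> next(n);
    std::iota(next.begin(), next.end(), 0);
    std::mt19937_64 rng{1};
    for (size_t k = n - 1; k > 0; --k) {          // Sattolo: single n-cycle permutation
        std::uniform_int_distribution<size_t> d(0, k - 1);
        std::swap(next[k], next[d(rng)]);
    }
    const size_t hops = 1u << 24;                 // same amount of work at every size
    size_t i = 0;
    auto t0 = std::chrono::steady_clock::now();
    for (size_t h = 0; h < hops; ++h) i = next[i];
    auto t1 = std::chrono::steady_clock::now();
    volatile size_t sink = i; (void)sink;         // stop the loop being optimised out
    return std::chrono::duration<double, std::nano>(t1 - t0).count() / hops;
}

int main() {
    for (size_t kb = 16; kb <= 256 * 1024; kb *= 4)   // 16 KB .. 256 MB
        std::printf("%8zu KB: %6.2f ns/load\n", kb,
                    ns_per_load(kb * 1024 / sizeof(size_t)));
}

On a typical desktop the figure sits near 1 ns while the set fits in L1/L2, then steps up toward 80-100 ns once it spills out of L3 into DDR4, all while the CPU reads 100% busy.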


Pimax -

 

Quote

About 60-70% performance improved on 2080Ti

 

The team have done some quick comparison test with 50+ games during the holiday on RTX2080Ti,
Here are some draft conclusions:

 

1. We have achieved ~35% performance improvement by implementing a new rendering algorithm but it's not backward compatible.

 

"for example: 

a. Pavlov VR, Batman: Arkham VR, IL-2 Sturmovik: Battle of Stalingrad;.....
old Unreal engine Not supported"

 

2. RTX2080Ti VS GTX1080Ti you will get ~30% performance improvement.

 

So when you combined solution 1 and solution 2. you can totally get more ~60% performance, taking GTX1080Ti as the benchmark .

 

 

 

http://forum.pimaxvr.com/t/about-60-70-performance-improved-on-rtx2080ti/9072

 

This came out yesterday, so I don't believe the latest reviews incorporate Solution 1. It looks like there may be work needed on the 1CGS side to take advantage of the performance improvement offered by Solution 1.
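
One nit on the arithmetic: the two gains multiply rather than add, so taking Pimax's numbers at face value the combined figure should be a bit higher than their headline. A quick check (my maths, their claims):

// pimax_claims.cpp -- compound Pimax's two claimed gains.
#include <cstdio>

int main() {
    const double algo = 1.35;  // claim 1: ~35% from the new rendering algorithm
    const double card = 1.30;  // claim 2: ~30% from RTX 2080 Ti over GTX 1080 Ti
    std::printf("combined: +%.1f%% over a GTX 1080 Ti baseline\n",
                (algo * card - 1.0) * 100.0);  // 1.35 * 1.30 = 1.755 -> +75.5%
    // So "~60-70%" looks like a rounded-down estimate, and it assumes the
    // two gains are independent, which is the big unknown.
}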

 

EDIT:  I guess the new rendering algorithm is not "new" but was already announced on July 31.

 

EDIT 2: I have been informed that Pimax's statement regarding IL-2's use of Unreal is incorrect.

 

Edited by VBF-12_Stick-95

23 hours ago, VBF-12_Stick-95 said:

EDIT 2: I have been informed that Pimax's statement regarding IL-2's use of Unreal is incorrect.

 

Jason made a statement about that yesterday in the same thread on the Pimax forum:

Quote

Hello Pimax,

Please do not say that the engine for IL-2 Sturmovik: Battle of Stalingrad is powered by Unreal or Unity. It is NOT in any way powered by Unreal or Unity. Our Digital Warfare Engine is a custom graphics and physics engine built from scratch and improved over many years now. We are not an off-the-shelf engine like Unreal or Unity and therefore any such changes to how VR is rendered is potentially a lot of difficult work. Even so, our current VR implementation is arguably one of the best out there when you consider how large our environment is and certainly the best when it comes to combat flight-sims. We will support VR technology whenever feasible for our small team.

Regards,

Jason Williams
Executive Producer
IL-2 Sturmovik: Great Battle Series

 

http://forum.pimaxvr.com/t/about-60-70-performance-improved-on-rtx2080ti/9072/101?u=r.m

 

But still, I am confident that Pimax will provide us with better performance in the future. It seems they are trying hard, at least:

Quote
 

Several additional improved solutions are to be done.

1. we just found some extra rendered vertical FOV which exceed the real vertical FOV. that waste about ~10% of current total rendering pixels. we will try to reduce the rendered vertical FOV but keep some margin and not impact the real visual FOV. Hope this helps us improve the whole performance 4%~6%.

2. Nivida keeps improving their driver for RTX2080/RTX2080Ti. This might help a bit. 3%-6%?

3. We're developing ASW. ASW will help a lot for the real VR experience while the FPS drop off.

4. 72Hz or 65Hz optional mode will do helps, might another ~8-10%?

Planned to be supported by the Pimax exclusive SDK in the future:
fixed foveated rendering and other new techs from VRWorks or LiquidVR.
60Hz*2 for Brainwarp in the future, we're trying.

We believe we put more effort into the software fine-tuning and optimization, more chance of performance improvement we can achieve. Since Pimax 8K/5K+ and RTX2080/2080Ti are newly born, they definitely have some margin to be optimized.

 

http://forum.pimaxvr.com/t/about-60-70-performance-improved-on-rtx2080ti/9072/94?u=r.m
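
Their item 1 estimate also hangs together if you assume only part of the frame time scales with pixel count. A small sketch of that reasoning (the 40-60% "fill-bound" share of frame time is my assumption, not Pimax's):

// fov_savings.cpp -- why cutting ~10% of rendered pixels only buys ~4-6%:
// just the fill/shading share of the frame scales with pixel count.
#include <cstdio>

int main() {
    const double pixel_cut = 0.10;               // Pimax: ~10% wasted vertical FOV
    for (double fill : {0.4, 0.5, 0.6}) {        // assumed pixel-bound share of frame time
        double frame = (1.0 - fill) + fill * (1.0 - pixel_cut);
        std::printf("fill-bound %2.0f%% -> %.1f%% faster\n",
                    fill * 100.0, (1.0 / frame - 1.0) * 100.0);
    }
    // Prints ~4.2%, ~5.3%, ~6.4%: essentially the 4%~6% band Pimax quotes.
}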

4 hours ago, SvAF/F16_radek said:

The news that Mr. Jason is monitoring VR development that closely is the best we've had lately.

 

Indeed. Fingers crossed for the depth buffer being passed to SteamVR so it can feed into the Oculus ASW 2.0 algorithm.
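
For the curious, the app-side plumbing is small. Roughly this, going by my reading of the OpenVR headers (IVRCompositor::Submit with the Submit_TextureWithDepth flag); treat it as a sketch, as field names may vary by SDK version:

// Submit an eye texture together with its depth buffer so the compositor
// (and depth-aware reprojection such as ASW 2.0) can use real depth.
#include <openvr.h>

void submit_eye_with_depth(vr::EVREye eye, void* colorTex, void* depthTex,
                           const vr::HmdMatrix44_t& projUsedToRender) {
    vr::VRTextureWithDepth_t tex = {};
    tex.handle            = colorTex;            // e.g. an ID3D11Texture2D*
    tex.eType             = vr::TextureType_DirectX;
    tex.eColorSpace       = vr::ColorSpace_Gamma;
    tex.depth.handle      = depthTex;            // matching depth buffer
    tex.depth.mProjection = projUsedToRender;    // projection used for this eye
    tex.depth.vRange      = {0.0f, 1.0f};        // depth range of the buffer
    vr::VRCompositor()->Submit(eye, &tex, nullptr, vr::Submit_TextureWithDepth);
}

The catch, as discussed, is that the game has to fill in that depth info itself; SteamVR can't conjure it after the fact.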

17 hours ago, Alonzo said:

 

Indeed. Fingers crossed for the depth buffer being passed to SteamVR so it can feed into the Oculus ASW 2.0 algorithm.


Yes, I suppose that would be the main issue, having to send it through SteamVR. But if that's not already a possibility, I'd guess Oculus would be very interested in making it one. The upcoming Quest will likely depend on every hack and trick there is. Well, unless they want to go full walled garden with it.

On 10/7/2018 at 7:53 AM, TUS_Samuel said:

In practice, large caches are only available on high-core-count CPUs, and those CPUs have low clock frequencies.

The i9-9900K has 16 MB of cache and overclocks to 5.0 GHz. That is more cache than the 8700K or 8086K, which have 12 MB.

On ‎10‎/‎6‎/‎2018 at 2:33 AM, 15th_JonRedcorn said:

So basically, unless we get massively increased CPU speeds in the near future, this thing is basically DOA. I've been playing Escape from Tarkov for the last week, just fed up with the performance in VR even with a monster rig. I know IL-2 wasn't built for VR, but having nearly the best hardware money can buy and still getting poor performance is tiring. We need much faster CPUs, or IL-2 needs to be completely rewritten to take full advantage of multicore CPUs. Jumping around 1-4 threads and using 50% of those cores isn't good enough. So unless 7nm is a massive leap, I think I will just give up on VR.

Strange, it runs really well on my 16 cores... maybe you need some more memory.

1 hour ago, Henree said:

Strange, it runs really well on my 16 cores... maybe you need some more memory.

 

It runs great on my 4-core i7-4820K.


I'd say he's full of it and doesn't know what his actual frame rate is. Running great because you think it runs great is different from getting a rock-steady 90 fps at good settings that don't make the game look like it's from 2005. I am curious what more than 16 GB of RAM would help with, considering I am nowhere near maxing out my memory with IL-2. 🙄

2 hours ago, Henree said:

Strange, it runs really well on my 16 cores... maybe you need some more memory.

What does "runs great" mean? 45 frames a second on low settings?

Edited by 15th_JonRedcorn

52 minutes ago, 15th_JonRedcorn said:

I'd say he's full of it and doesn't know what his actual frame rate is. Running great because you think it runs great is different from getting a rock-steady 90 fps at good settings that don't make the game look like it's from 2005.

 

I'm skeptical too, but there might be something crazy with IL-2, like the game's working set being 25 MB: if you can fit it all in CPU cache, suddenly the whole thing runs amazingly faster, so a low-clocked, large-cache Xeon is really, really good. But we'd need someone to actually test that with controlled settings for me to believe it.

 

Possible CPUs to test, if anyone has one: Threadripper 1950X (32 MB cache), i9-7980XE (24.5 MB cache). Probably loads more Xeon-ish chips.

On 10/9/2018 at 7:02 PM, chiliwili69 said:

The i9-9900K has 16 MB of cache and overclocks to 5.0 GHz. That is more cache than the 8700K or 8086K, which have 12 MB.

 

Well, I think (only think) that we should look at L3 cache per core, not total cache. By that measure they are all equal: the 9900K has 16 MB / 8 cores = 2 MB per core, and the 8700K and 8086K have 12 MB / 6 cores = 2 MB per core.

Looking at this table, maybe a delidded 8086K would be as good as the 9900K:

[Image: Intel's 9th-gen Core announcement table comparing the i9-9900K, i7-9700K and i5-9600K]

15 hours ago, Alonzo said:

so a low-clocked, large-cache Xeon is really, really good

 

I don't think Xeon processors are better for IL-2 VR.

In the company where I work, we use an application (dynamic process simulation) which is very much single-threaded, and its performance correlates closely with the Single-Thread Mark (STMark) from Passmark.

The RTF is our benchmark; it would be our fps in IL-2 VR terms:

[Chart: RTF plotted against Passmark Single-Thread Mark for the CPUs we tested]

 

We have tested several CPUs over the past 8 years, including a Xeon E5-2643 with 20 MB of cache, but its performance was much lower (70 vs. 99) than the 8700K at 5.0 GHz:

[Chart: RTF results including the Xeon E5-2643]

 

Many of our customers' IT departments insisted on using virtualised PCs on Xeon servers to run our application, but I have had to convince them to run it on "gaming" CPUs to achieve better performance!

15 hours ago, Alonzo said:

i9-7980XE (24.5 MB cache).

 

If you look at "Monitor 3.0", Zacharias was running the test on a monitor with an i9-7900X chip, but it was not performing better than a modest 4790K.

https://docs.google.com/spreadsheets/d/1gJmnz_nVxI6_dG_UYNCCpZVK2-f8NBy-y1gia77Hu_k


This is the first time I've viewed this thread. I'm an Oculus user (a pretty happy one, I must say), but it got me wondering: in the year since this topic was started, what's the status of the Pimax headset?

47 minutes ago, Blitzen said:

This is the first time I've viewed this thread. I'm an Oculus user (a pretty happy one, I must say), but it got me wondering: in the year since this topic was started, what's the status of the Pimax headset?

 

Similar to the Oculus DK2: a year before a commercial release.

