Dadtom65

AMD 3900X versus i9 9900K

Recommended Posts

Hi guys, I've been thinking for a while now about upgrading my computer, maybe towards the end of this year or possibly early next year. I've been checking the stats of both CPUs and found that the AMD CPU is way above the Intel one, or seems to be. Has anyone used that AMD CPU in, say, X-Plane or P3D, and what do you think? I mean, I've been with Intel for years, so I have no idea what's best. Any info would be welcome. Thanks, Derek.

Share this post


Link to post
Share on other sites

Hi,
I had a 9900K and then changed to a 3950X.
I run my 3950X with 12 cores enabled; with all 16 cores I had bad FPS.
Between 12 and 8 cores there is no difference; Intel's only advantage is the higher clock.


It appears to me, from watching and reading other simmers' experiences, that Intel's 9900 and 9900K are able to run P3D and XP at higher settings than any of AMD's CPUs. However, where the Ryzens shine is in running sims smoothly (none of the infamous Intel stutters) at lower settings, doing it at a cheaper price and doing it cooler.

Check out the I9-9900K thread:
https://orbxsystems.com/forum/topic/172742-i9-9900k/

Thing is, by the end of this year both Intel's new 'Comet Lake' processors (10000-series) and AMD's new 'Zen 3' (Ryzen 4000-series) will have launched.
That will make the current bleeding-edge CPUs cheaper (while they remain in stock). It will be up to you to decide whether the price-to-performance ratio matters more than absolute performance unencumbered by budgetary concerns.
I wouldn't go for any of AMD's Threadripper CPUs. As PerfectFlight points out, the extra cores give no extra performance (in fact they worsen it) as the workload is spread across too many cores, and they are more expensive.


Apart from the most demanding scenarios (e.g. over Central London in TE GB, approaching Aerosoft's EGLL runway 27L in the PMDG 737NGX with AI and weather on), FPS dropped no lower than 21 and generally held at 24/25.
In nearly every other scenario I have FPS locked at a very steady 30.

You can see my PC specs in my signature below.
 

Posted (edited)

Thanks guys for the heads-up, and as someone said, there's more new stuff on the way. Maybe I will wait for a while and see what happens. Derek.

Edited by Dadtom65


Hi

 

I decided on the Ryzen 3800X over the 3900X, as it seems to be the sweet spot in core and thread count while still delivering single-threaded performance on par with the 9900K.

 

Intel's Comet Lake is unfortunately just a regurgitation of the old 14nm architecture and will likely not have a decent upgrade path, whereas AMD's Ryzens not only maintain an excellent upgrade path but use a modern 7nm process that is proving to be much more power efficient and to run cooler.

 

Although clock speeds are higher on the Intel CPUs, clock-for-clock IPC performance is better on the AMD Ryzens, and will be better still with the upcoming 4000-series CPUs.
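The clock-versus-IPC point can be made concrete with a one-line model: useful single-thread throughput is roughly instructions-per-cycle times clock frequency. A minimal Python sketch; the IPC and clock figures below are made up for illustration, not measured values for any of these CPUs.

```python
def throughput(ipc: float, ghz: float) -> float:
    """Rough single-thread throughput in billions of instructions per second."""
    return ipc * ghz

# Hypothetical chips: one clocks higher, the other does more work per cycle.
high_clock = throughput(ipc=1.0, ghz=5.0)    # 5.0
high_ipc = throughput(ipc=1.25, ghz=4.2)     # 5.25
print(high_ipc > high_clock)                 # True
```

The point being that a lower-clocked chip with better IPC can match or beat a higher-clocked one, so comparing GHz numbers alone is misleading.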

 

 

ST 3800X.PNG


I know absolutely nothing about either of the two brands, but take a google at the specs just published for the new Xbox Series X and the new PS5.

Both have chosen the same AMD Ryzen, and presumably both Microsoft and Sony know what they are about when it comes to gaming performance, and the choice will need to last around five years before the next-gen boxes.


Thank you both for the last comments. I will now have to wait even further, as last night our damn washing machine packed up with a smell of burning, as had our dishwasher a few months earlier. >:( In hindsight, they were both quite old, like me. Life can be a problem sometimes, and with the current state of affairs God knows when we can get out again. Derek.

20 minutes ago, Dadtom65 said:

Thank you both for the last comments. I will now have to wait even further, as last night our damn washing machine packed up with a smell of burning, as had our dishwasher a few months earlier. >:( In hindsight, they were both quite old, like me. Life can be a problem sometimes, and with the current state of affairs God knows when we can get out again. Derek.

No worries Derek, cheer up :)

 

No doubt you can find a Dolly Tub and a Posser somewhere :rolleyes:


Hi Derek

Just came across the latest quality video on the differences between the PS5 and the Xbox.

Note the first descriptions of the AMD, as I described earlier.

Also take note of the full specs and compare them with your PC now. Work out how much it would cost you to match your PC to the new Xbox Series X, whether it will do 4K x 120 FPS / 8K x 60 FPS, and whether it will cost a lot less than your new washer. You have until November to save up the 400 quid, but you will need a new bit of hardware (a flight yoke) for 2020, and a large 4K/8K TV as a monitor.

MS 2020 will be the icing on the spec! Keep the present PC for all the usual PC things and, in your old age, enjoy the new sim and the brilliant games.

I love the F1 racing and the road racing in the Lake District in Forza 4.

Posted (edited)

Hi John, thanks for the video, and yes, it looks like the AMD CPU is better than the Intel one at the moment. In fact there was too much info and I got lost halfway through. One thing: that PlayStation looked a bit weird to me. Derek.

Edited by Dadtom65

On 3/29/2020 at 5:28 AM, John Heaton said:

Also take note of the full specs and compare them with your PC now. Work out how much it would cost you to match your PC.

Indeed. I was astonished by that as well.


Hi John, I would be dead in a few seconds if I said no, I wanted a new computer before a washing machine. :lol: Plus we've already got one coming today, the washing machine that is. It cost about the same as a new AMD CPU. No worries, my time will come. Derek


I don't think your flight sim will run very well on the washing machine, but she who must be obeyed will be happy :)

5 hours ago, Taph said:

I don't think your flight sim will run very well on the washing machine

Well... if you have your sim running on a laptop, you can always put the laptop on the washing machine. I bet it'll still run. :rolleyes:


That's one way of taking your plane for a spin, I suppose :P


Well, she got her washing machine, but in the rush to get it going we forgot something! I tell you, it was like an aircraft bouncing around the kitchen. Three times we tried to get it going properly, and three times there were three of us trying to hold it down while it was spinning. At the time we thought it was because it wasn't level, until we reread the instructions and found we had not removed the transport locks. I really thought it would be knackered, but the wife tried it again today after our son had finished work, and thank God it was OK and spinning without us sitting on it. :lol: We were very lucky. Derek.


 :lol: Yes, you always need to remove the chocks before flight. Might need to put red flags on them..

9 hours ago, Dadtom65 said:

Well, she got her washing machine, but in the rush to get it going we forgot something! I tell you, it was like an aircraft bouncing around the kitchen. Three times we tried to get it going properly, and three times there were three of us trying to hold it down while it was spinning. At the time we thought it was because it wasn't level, until we reread the instructions and found we had not removed the transport locks. I really thought it would be knackered, but the wife tried it again today after our son had finished work, and thank God it was OK and spinning without us sitting on it. :lol: We were very lucky. Derek.

Derek,

I thought to ask: "Did it come with a checklist?"

Then I noticed the date :rolleyes:

A quote comes to mind: "Many a true word spoken in jest".

Or another: "No fool like an old fool".

Or: when all else fails, read the B#$%^& instructions ;)


Thank you so much John, and yes, you're right on all three counts. :lol: Derek.

Posted (edited)

Glad the washing machine is sorted; that gives you time now to save for a 4th-gen Ryzen.

I have just upgraded to a Ryzen 3900X (from a Ryzen 1700) and am super happy with the utter smoothness running P3D, both v4 and now v5 (since yesterday).

 

My P3D is used only at full 4K resolution, and the Ryzen 1700 (8 cores/16 threads) running at just 3.4 GHz already gave very smooth sim performance locked to 30 fps. The sim would use up to 100% on all 16 threads. @Doug Sawatzky gives good advice, and the newer, faster 3800X will be waaayyy better. Waiting for the 4th gen (probably 6 months away) will be better still: faster and more efficient again.

 

I've now seen my R3900X in P3Dv4 run up to 100% on all 24 threads at a native, unoverclocked 4.2 GHz, and the sim remains butter smooth. My Settings\World sliders are all maxed. This is in super-dense photoreal areas and some super-heavy airports. I know I should have waited for 4th gen, but hey, lockdown blues...

A big AMD advantage is that the CPUs generally come with a perfectly satisfactory cooler (no added expense) and the motherboards are lower cost. You can even get an older (or even used) B450 motherboard and it will run just fine. My shiny new 3900X is running in my old X370 motherboard with only a BIOS update. B450 and X470 boards are available and cheap, but best is an X570, and there are reasonably priced ones available. Some X570 boards will take 3 M.2 NVMe sticks (super-fast SSDs), handy for simming installations.

 

Please know that AMD and Intel CPUs work quite differently. With Intel, for simming you really want a CPU/motherboard/cooler solution that gets you 5+ GHz. And even then you will occasionally see the notorious Intel stutter. I've seen a 9900K stutter in P3Dv4. I think owners just get used to it until they no longer notice it, or secretly accept it. It is the elephant in the room while they bang on about 5+ GHz and overclocking, and how AMD must be so inferior because it can't do 5 GHz. AMD chips render graphics much better than Intel chips, and AMD just doesn't need 5 GHz for good sim performance. So a high-end Intel setup may ultimately give you higher framerates, but not smoother framerates. See here for what just 3.4 GHz can do in P3Dv4: https://imgur.com/gallery/LuU1ruC (all settings used are displayed).

 

Now, this is for P3D, which does use all your CPU resources. X-Plane, even with the new Vulkan graphics update, is still basically a single-threaded program, and high GHz will help there. I have X-Plane, but mainly use P3D as it is more visually satisfying (smoother) for me. I lock P3D to 30 fps and don't bother with overclocking or tweaking the CPU/motherboard/graphics card. It all works thoroughly well at stock. No fussing.

 

If you go to a movie, or watch TV, it generally looks smooth, even though it is just 24 or 25 fps. The human eye doesn't see more than 30 fps. I have amusedly watched flame wars with guys who say they can see the difference between 144 fps and 200 fps. Just not anatomically or biochemically possible. Nerves, including vision nerves, need repolarisation time. Nerves don't really pass electrical current along the nerve the way a wire does. On each side of a nerve filament's cell membrane there are +ve ions on one side and -ve ions on the other. When the nerve is stimulated to send a signal, these ions swap sides in a wave down the cell, propagating the signal. Then the nerve needs these ions to repolarise back to their original side, by drifting back across the membrane, before a new impulse can be propagated. This happens quickly, but for nearly everyone, 30 Hz is about as quick as it goes. Maybe some can tell a difference between 30 Hz and 60 Hz monitors and fps, but most of us can't. Nerves can't be overclocked.

 

So, I am happy simming at 30 fps, and really like both of my AMD Ryzen CPUs for their smoothness at 30 fps. No Intel stutter. The 3900X is just better still than the 1700. 4th gen will be better and more efficient again.

 

I won't discourage you from getting an Intel CPU. They are good too, but for my use case AMD just works better (including for certain important non-sim things I do). Follow Doug's advice and you will be happy.

Cheers.

Edited by ozboater
The link
47 minutes ago, ozboater said:

If you go to a movie, or watch TV, it generally looks smooth, even though it is just 24 or 25 fps. The human eye doesn't see more than 30 fps. I have amusedly watched flame wars with guys who say they can see the difference between 144 fps and 200 fps. Just not anatomically or biochemically possible. Nerves, including vision nerves, need repolarisation time. Nerves don't really pass electrical current along the nerve the way a wire does. On each side of a nerve filament's cell membrane there are +ve ions on one side and -ve ions on the other. When the nerve is stimulated to send a signal, these ions swap sides in a wave down the cell, propagating the signal. Then the nerve needs these ions to repolarise back to their original side, by drifting back across the membrane, before a new impulse can be propagated. This happens quickly, but for nearly everyone, 30 Hz is about as quick as it goes. Maybe some can tell a difference between 30 Hz and 60 Hz monitors and fps, but most of us can't. Nerves can't be overclocked.


I was happily reading about your experience in P3D and nodding along with a lot of your comments regarding Ryzen.

However, I have to disagree with your comments about the human eye.
I looked into it about 6 months ago:
https://orbxsystems.com/forum/topic/181181-frame-rates/?do=findComment&comment=1575714

 

I found that the 24 FPS figure relates to cinema film costs, relative motion, and the way the brain interprets it all to give a moving image.
The brain can interpret well in excess of 30 FPS.

 

Anyway, I agree: P3D is not a single-threaded program and benefits from using more cores.
With a relatively strong Ryzen PC, I get a good performance experience from P3Dv4 (it usually stays locked at 30 FPS) at power draw, temperature and noise levels that are very acceptable for the settings I use.

 


Thanks 737,  I will check this out.

Personally, and fortunately, 30 fps works for me.

 


@Dadtom65.  New confirmed AMD info gives you a nice savings schedule if you are thinking of going that way.

 

It has now been confirmed that the next gen Ryzen 4000 CPUs will release in August-September this year.

More importantly, they will be compatible with 400-series motherboards, that is B450 and up, and a new 600-series platform will also be available.

 

Hope this helps.

16 hours ago, ozboater said:

This happens quickly, but for nearly everyone, 30 Hz is about as quick as it goes. Maybe some can tell a difference between 30 Hz and 60 Hz monitors and fps, but most of us can't. Nerves can't be overclocked.

 

Nice try explaining it with the refractory period, but you "forgot" one essential part that is different when comparing the eye and its nerve transmission to the brain with a computerised system: the biological thing works continuously ;-) Or do you have a shutter in your eyes? I guess not, so light arrives continuously on your retina and the image is continuously processed. It is not as though your retina takes a picture every 16 ms and then the retina and all its cells shut down. The millions of cells in your retina are constantly being activated, firing a nerve impulse and then entering a refractory period of about 16 ms (or even more). But while one cell is recovering, there are millions of other cells ready to be activated. Always, at any time. That is why you are certainly able to discriminate easily whether something is shown to you at 30 FPS, 60 FPS or 120 FPS. Some air force pilots were even capable of correctly identifying an airplane shown for 1/200th of a second.


Thanks guys for your help, but as they say, I'm old and hate change, so I'm thinking I will stick with Intel, like an i9 or something. Thanks, Derek.

Posted (edited)
2 hours ago, AnkH said:

 

Some airforce pilots were even capable of correctly identifying an airplane shown for 1/200th of a second.

Wow, I find that utterly amazing. Seems like my 30 fps eyes are just lazy :)

 

@Dadtom65

Glad you've decided, and you will enjoy your upgrade, as capability has really moved on.

FWIW, since you are planning an upgrade, it will be worth considering a graphics card with as much memory as possible. I've occasionally (though definitely not regularly) had P3D, both v4 and v5, just quit to desktop with an out-of-memory error. In this case it is not RAM (I've got 32 GB) but graphics memory. My card is an 8 GB GTX 1080, but I've also heard of this happening to some with an 11 GB 1080 Ti or 2080 Ti. It won't be so much about speed as memory capacity. The new nVidia 3080 Ti is just about hitting the market, and at least one model has 16 GB of GDDR6 memory, but it is an additional $100 to $200 more expensive than a current 11 GB 2080 Ti.

The reading I've done suggests that with high-end graphics cards there is not the leap in performance one would expect as you move up the stack. But for flight simming, it looks like graphics memory capacity will be an important factor for the future. So if you are planning on 4K simming, I'd recommend a 16 GB card if you can.
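For a rough sense of why graphics memory fills up at 4K, here is a back-of-envelope Python sketch. The numbers are illustrative only: real sims stream and compress textures, so actual usage varies widely.

```python
def buffer_mb(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    """Size of an uncompressed image buffer in MiB."""
    return width * height * bytes_per_pixel / 1024**2

# A single 32-bit 4K render target is modest:
print(round(buffer_mb(3840, 2160), 1))   # ~31.6 MiB

# But uncompressed 4096x4096 textures cost 64 MiB each, so an
# 8 GiB card holds only ~128 of them before anything else:
tile = buffer_mb(4096, 4096)
print(int(8 * 1024 / tile))              # 128
```

So a photoreal scene that suddenly references a few hundred large textures can spike past an 8 GB card's limit, which fits the occasional out-of-memory crash pattern described above.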

 

PS: This has also happened in an unnamed sim I am testing. Task Manager shows a memory spike to the maximum graphics memory, and then poof... it's gone.

Edited by ozboater
Add the PS

39 minutes ago, ozboater said:

The new nVidia 3080 Ti is just about hitting the market, and at least one model has 16 GB of GDDR6 memory, but it is an additional $100 to $200 more expensive than a current 11 GB 2080 Ti.

 

Do you have a reliable source for this? As I skipped the 2080Ti, I am really looking forward to a 3080Ti, but if it is way too expensive...

On 3/29/2020 at 4:28 AM, John Heaton said:

Hi Derek

Just came across the latest quality video on the differences between the PS5 and the Xbox.

Note the first descriptions of the AMD, as I described earlier.

Also take note of the full specs and compare them with your PC now. Work out how much it would cost you to match your PC to the new Xbox Series X, whether it will do 4K x 120 FPS / 8K x 60 FPS, and whether it will cost a lot less than your new washer. You have until November to save up the 400 quid, but you will need a new bit of hardware (a flight yoke) for 2020, and a large 4K/8K TV as a monitor.

MS 2020 will be the icing on the spec! Keep the present PC for all the usual PC things and, in your old age, enjoy the new sim and the brilliant games.

I love the F1 racing and the road racing in the Lake District in Forza 4.

 

John, having built several computers for flight sims, including my latest, an i9 9900KS (watercooled, OC to 5.3 GHz) with an RTX 2080 (OC to 2086 MHz), it performs only maybe 5% better in frame rates than my son's i7 8700K (OC to 5.2 GHz) with a GTX 1080 Ti.

 

The real limiting factor is that the sims currently on offer don't use more than one core for their main processing of physics and graphics, so the key is having one fast core while the rest just muddle along with background processing. These sims are so CPU-bound that the graphics cards are underutilised.
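The one-fast-core point can be put in numbers with Amdahl's law: if only a fraction of the sim's work can run in parallel, extra cores quickly stop helping. A minimal Python sketch, using made-up fractions rather than any measured P3D or X-Plane figures.

```python
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Amdahl's law: overall speedup from `cores` cores when only
    `parallel_fraction` of the work can be parallelised."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# If, hypothetically, 70% of a sim's frame work is stuck on one
# thread (30% parallel), piling on cores barely moves the needle:
for cores in (2, 4, 8, 16):
    print(cores, "cores ->", round(amdahl_speedup(0.3, cores), 2), "x")
```

Even with 16 cores the hypothetical speedup is only about 1.4x, which is why a faster single core can beat a wider CPU in a largely single-threaded sim.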

 

The problem with AMD is that while their CPUs are multi-core and perform well in synthetic testing and advanced rendering for CGI, they just don't have anywhere near the clock speeds on a single core (or quad core) compared to Intel. So the bottom line for current flight sims: Intel is still the best choice for sims like X-Plane and P3D.

 

The yet-to-be-released MSFS 2020 may well make better use of more cores, so maybe wait until that is released, as it may well be a game changer.

 

These console machines are designed for games that are multi-core, so it's like comparing a drag racer with a Ferrari: totally different purposes. This is why I have a high-spec computer, as I also play shooters with ray tracing, which utilises the hardware better. But for flight sims at the moment, a top-spec computer with an i7 6600K and a 1080 Ti would probably do just as well as my i9 9900KS for X-Plane and P3D.

 

This is not a discussion about value for money, but if you have the money then Intel is currently the best; AMD will get you value.

 

Feel free to PM me if you want some help.

 

Hope that helps.

  • Upvote 1

Share this post


Link to post
Share on other sites
Posted (edited)

Correction, and very glad you asked...

I speed-read, and consequently misread, a piece tonight on Tom's Hardware (I'll blame my rheumy 30 fps eyes).

 

MSI has released the first 16 GB GDDR6 2080 Ti.

They aim to differentiate themselves from the other manufacturers, who offer the now-vanilla 11 GB.

This will be an eye-watering $2000+ in Australia, a US$100 to $200 premium over 11 GB cards. I can live with the occasional CTD for the moment.

 

Since I have heard that some 11 GB cards have crashed the sim to desktop from lack of video memory at 4K, the 16 GB version could be the best choice for 4K P3D. I don't know about other sims, as currently I'm only using P3Dv5, and testing another in 4K.

 

PS: I don't think there is that much of a performance boost with 16 GB of GDDR6 (I gather just 2 fps over an 11 GB card), but for 4K simming it could be the grail.

Edited by ozboater
Correction and PS

1 hour ago, Mawson said:

 

The real limiting factor is that the sims currently on offer don't use more than one core for their main processing of physics and graphics, so the key is having one fast core while the rest just muddle along with background processing. These sims are so CPU-bound that the graphics cards are underutilised.

 

Regardless, I have now seen spikes up to 100% on ALL 24 threads in P3Dv5. I'm glad I have a program that can max out the horsepower of my CPU. As you say, even if the main sim processing is happening on one core, that is a fuzing lot of horsepower spent on other sim processing tasks. Just think for a moment about maxing out 24 threads...

 

I don't know how P3D is programmed to use CPU resources, but I read recently in an FSElite interview with a Lockheed Martin staffer that for v5, LM have laboured to offload more tasks to other cores. It seems to be working nicely, as my sim performance is very smooth.

 

I'm not a computer engineer, but my empirical observation is that AMD and Intel CPUs work quite differently, and thus excel in different areas. There is no doubt that Ryzen CPUs render at a higher level than comparable Intel CPUs; many, many benchmarks demonstrate this. My understanding is that with P3D, all this other CPU processing is probably graphics rendering, and this is how Ryzen CPUs deliver smoother performance than equivalent Intel CPUs.

 

I have seen occasional micro-stuttering even on a 9900K with a 2080 Ti running P3D. I don't see that with my Ryzen CPU, even running an old 1080 card in 4K.

 

With a current Intel CPU you really need the high GHz for smoother performance, because it works differently. A simple comparison of clock rates is just that: too simple.

 

Anyway, in the end, for me the ultimate test is desktop performance: experiencing P3D on screen. I don't care how fast a CPU is if the screen shows stuttering. A real bummer, especially when landing.

 

I recall a guy crowing about the 60 fps from the 5.2 GHz all-core overclock on his new watercooled Intel CPU. Yes, mostly he got a solid 60 fps, but he ignored the frequent stutter, especially in complex areas. At one stage his screen readout showed a momentary drop to just 4 fps. That is not really 60 fps, no matter how you want to explain it away. He'd just become used to it, so he ignored it.

 

It seems to me that the Intel stutter is the elephant in the room that nobody wants to talk about, or that owners have just learned to ignore and get very defensive about. High fps is not really high fps if there is stutter. 'Oh, it is mostly 60 fps...'

 

The AMD and Intel CPUs just work differently, and clock speed is no longer a reliable indicator of sim performance.

And high fps is not true fps if it stutters, or even micro-stutters down to lower fps. It is mostly high, but with stutter.
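The "mostly 60 fps but it stutters" effect is easy to quantify from frame times: average FPS hides stalls, while a 1%-low figure exposes them. A small Python sketch with invented frame times, not measurements from any particular rig.

```python
def average_fps(frame_times_ms):
    # Average FPS over the run: total frames / total seconds.
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def one_percent_low_fps(frame_times_ms):
    # FPS implied by the slowest 1% of frames (at least one frame).
    n = max(1, len(frame_times_ms) // 100)
    worst = sorted(frame_times_ms)[-n:]
    return 1000.0 * n / sum(worst)

locked_30 = [33.3] * 100              # steady, locked 30 fps
stuttery_60 = [16.7] * 99 + [250.0]   # "60 fps" with one big stall

print(round(average_fps(stuttery_60)))          # looks healthy: 53
print(round(one_percent_low_fps(stuttery_60)))  # the stutter: 4
print(round(one_percent_low_fps(locked_30)))    # steady: 30
```

A momentary drop to 4 fps barely dents the average but dominates the 1% low, which matches what the eye experiences as a stutter.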

 

I totally agree with your observation on graphics cards. My reading around the internet also seems to show that the level of graphics card is currently not so important to on-screen sim performance. That's good for me, as it reduces the pressure to update my ageing plain-Jane 1080. The amount of graphics card memory does seem to help reduce sporadic CTDs. Even 11 GB Ti users have reported them; for them it will be less often than the occasional CTD I get on 8 GB. Both P3D v4 and v5 (and a sim I'm testing) very occasionally do this when graphics memory spikes above the card's limit. MSI just released a 16 GB GDDR6 2080 Ti card which could alleviate this. I suspect there will be more to learn here, but fortunately it is infrequent. 4K and photoreal with sliders maxed (which is where I like to sim) occasionally places a lot of demand on graphics memory.

Cheers.

 

PS (clarification): I'm only talking about P3D, not X-Plane or any other sim. You may well be entirely correct for X-Plane. I offer no opinion there, as I no longer run it and have no recent experience.


OK, I've done a lot of reading tonight about what we see on screen when simming. If anyone can elaborate to inform me better and more accurately, I'm all ears and willing to learn.

There are two parts to what you see on screen when flight simming with P3D.

 

The first part is, as @Mawson says, having a fast core to process the mainly single-core physics etc. Mawson says that core also processes the graphics, but this is now only partly true, and limited. There is a new world with a new reality: P3D now uses more than one core for this, and LM have said that going forward they will continue to spread the processing load further.

 

The second part is entirely separate from the sim physics etc., and it is what DX12 is for. Rendering all the data that the main sim produces (no longer on just a single core for P3D) and turning that processed data into the image on your screen is what gives you your screen experience. The better it is rendered, the fewer frames will be dropped.

As I currently understand it, think of it as trying to watch a 4K movie on a CPU without enough horsepower: playback is jerky because frames get dropped while the onscreen display tries to keep up with the progress of the movie.

 

You can have the fastest single-core processing in this universe, but if your rendering is not up to getting the processed image to screen, you're gonna drop frames, and I believe this is the cause of the Intel stutter: dropped frames.

 

While an AMD CPU may not have as high a clock speed (though it is really about IPC, NOT frequency alone), the better data rendering to screen avoids dropped frames, and so avoids stutter. No dropped frames = no stutter.

 

If you don't mind dropped frames and stutter, and want to bask in the satisfaction of knowing that your physics is calculated really, really fast, then high clock speed will work for that. But on screen you may only see some of that superior calculation, with momentary loss of image frames when the rendering engine can't keep up with the sim.
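The dropped-frames idea can be sketched in a few lines of Python: at a 30 Hz display interval, any frame that takes longer than its ~33.3 ms budget misses its slot and the previous image repeats. The frame times below are invented for illustration.

```python
def missed_slots(render_times_ms, budget_ms=1000.0 / 30):
    """Count frames that overran the display interval (seen as stutter)."""
    return sum(1 for t in render_times_ms if t > budget_ms)

# A run where most frames are fine but two overrun the 30 Hz budget:
frame_times = [20.0, 25.0, 48.0, 22.0, 70.0, 24.0]
print(missed_slots(frame_times))   # 2
```

Real presentation is more complicated (vsync, triple buffering, variable refresh), but the principle holds: it is the overruns, not the average, that you see as stutter.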

 

Balance can give you a better screen experience. Having cracking graphics-rendering ability will help display ALL that calculated data without dropped frames, i.e. smooth screen (and hence sim) performance. This is one area where AMD differs from Intel. There are hundreds and hundreds of YouTube videos by computer guys that show AMD slapping comparable Intel CPUs with impunity in this area.

Note: I didn't say all areas, just graphics-rendering calculations. AMD and Intel CPUs have different strengths and weaknesses because they work differently. E.g. if you don't game at 1080p, then 1080p performance is irrelevant; if you do, it is totally relevant. Intel is the undisputed king of low-res gaming.

 

It is a choice for each of us, and we are all correct for our personal priorities. Personally, I have found the smoother AMD experience more satisfying for enjoying my simming time.

 

It is the DX12 rendering engine that is doing the heavy graphics lifting behind the scenes. DX12 can now use more of your horsepower. When Mawson says 'the rest [of the cores] just muddle along with background processing', that is what DX12 is doing to display your onscreen image.


Also, unfortunately for Intel, when the 4000-series AMD Ryzens launch later this year with another substantial IPC improvement and further power-efficiency gains, the Intel stuff will become basically irrelevant. Intel has nothing new coming down the pipe for several years, and even by then AMD will have made even more gains.

Posted (edited)
17 hours ago, ozboater said:

 

Regardless, I have now seen in P3dv5, spikes up to 100%on ALL 24 threads. I'm glad I have a program that can max out the horsepower of my cpu. As you say, even if the main sim processing is happening on 1 core, that is a fuzing lot of horsepower spent on other sim processing tasks. Just think for a moment about maxing out 24 threads...

 

I don't know how P3D is programmed to use CPU resources, but I read recently in an FS Elite interview with a Lockheed Martin staffer that for v5, LM have laboured to shed off more tasks to other cores. Seems to me that it is working nicely, as my sim performance is very smooth.

 

Not being a computer engineer, my empirical observation is that AMD and Intel cpus work quite differently, and thus excel in different areas. There is no doubt that Ryzen cpus render at a higher level than comparable Intel cpus. Many, many benchmarks demonstrate this. My understanding is that with P3D, all this other cpu processing is probably graphics rendering, and this is how Ryzen cpus deliver smoother performance than equivalent Intel cpus.

 

I have seen occasional micro stuttering even on a 9900K with 2080 Ti running P3D. I don't see that on with my Ryzen cpu, even running an old 1080 card in 4K.

 

With a current Intel cpu, you really need the high GHz for smoother perfrmance, because it works differently. A simple comparison of clock rate is just that, too simple.

 

Anyway, in the end, for me, the ultimate test is desktop performance: experiencing P3D on screen. I don't care how fast a CPU is if the screen shows stuttering. A real bummer, especially when landing.

 

I recall a guy crowing about the 60 fps from the 5.2 GHz all-core overclock on his new water-cooled Intel CPU. Yes, he mostly got a solid 60 fps, but he ignored the frequent stutter, especially in complex areas. At one stage his screen readout showed a momentary drop to just 4 fps. That is not really 60 fps, no matter how you want to explain it away. He'd just become used to it, so he ignored it.

 

It seems to me that the Intel stutter is the elephant in the room that no one wants to talk about; people have just learned to ignore it and get very defensive about it. High fps is not really high fps if there is stutter. 'Oh, it is mostly 60 fps...'

 

The AMD and Intel CPUs just work differently, and clock speed is no longer a reliable indicator of sim performance.

And a high fps figure is not true fps if it stutters, or even micro-stutters, down to lower fps. It is mostly high, but with stutter.

 

I totally agree with your observation on graphics cards. My reading around the internet also suggests that the tier of graphics card is currently not that important to on-screen sim performance. That's good for me, as it reduces the pressure to update my ageing plain-Jane 1080.

The amount of graphics card memory does seem to help reduce sporadic CTDs. Even 11 GB Ti users have reported them; for them they will just be less frequent than the occasional CTD I get on 8 GB. Both P3D v4 and v5 (and a sim I'm testing) very occasionally do this when GPU memory spikes above the card's limit. MSI just released a 16 GB GDDR6 2080 Ti card, which could alleviate this. I suspect there is more to learn about this, but fortunately it is infrequent. 4K and photoreal with heavily maxed sliders (which is where I like to sim) do occasionally place a lot of demands on video memory.

Cheers.

 

PS - clarification - I'm only talking about P3D, and not X-Plane or any other sim. You may well be entirely correct for X-Plane. I don't offer any opinion here as I no longer run it and have no recent experience.


I don’t have any stutters on my i9-9900KS on a flat screen, but there is a VR bug at the moment that has been noticed. On the forums it seems both AMD and Intel have the problem, so it appears to be a program issue rather than a CPU one.
 

I also note that when loading scenery, quite often P3D will use all cores at 100%, but in the actual sim, once it's loaded, it's still mainly using one core, four at most.

Edited by I8Orbx
  • Upvote 1

Share this post


Link to post
Share on other sites
Posted (edited)
On 3/28/2020 at 8:58 AM, Doug Sawatzky said:

Hi

 

I decided on the Ryzen 3800X over the 3900X, as it seems to be the sweet spot in core and thread count while still delivering single-threaded performance on par with the 9900K.

Intel Comet Lake is unfortunately just a regurgitation of the old 14 nm architecture and will likely not have a decent upgrade path, whereas the AMD Ryzens not only maintain an excellent upgrade path but use a modern 7 nm process that is proving to be much more power efficient and to run cooler.

Although the clock speeds are higher on the Intel CPUs, the clock-for-clock IPC performance is better with the AMD Ryzens, and will be even better with the upcoming 4000-series CPUs.

 

 

ST 3800X.PNG


Doug, here is my i9 9700KS OC'd to 5.3 GHz at Monterey, using 11% of the CPU with SteamVR running but P3D not switched to 3D. During load-up it went to 90%, but it is still using mostly one core. There is no affinity masking or any other malarkey. If people are saying they are getting 24 cores maxed out, I am yet to see more than 1 out of 14.
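For anyone curious what an affinity mask actually is: it is just a bit field in which bit i enables logical processor i. A minimal Python sketch to decode one (illustrative only; the example mask values are arbitrary and not tied to any particular P3D configuration):

```python
# Sketch: decode an affinity-mask integer into the logical cores it enables.
# Bit i set => logical processor i may be used by the process.
def cores_from_mask(mask: int) -> list[int]:
    cores = []
    i = 0
    while mask:
        if mask & 1:
            cores.append(i)
        mask >>= 1
        i += 1
    return cores

# 85 = 0b01010101 -> every other logical core (e.g. one thread per physical core)
print(cores_from_mask(85))   # [0, 2, 4, 6]
# 15 = 0b1111 -> the first four logical cores
print(cores_from_mask(15))   # [0, 1, 2, 3]
```

So "no affinity masking" simply means the mask covers every logical processor and the OS scheduler is free to place threads anywhere.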

29A686D3-18DF-4708-ADB5-156FF0082ED8.jpeg

Edited by I8Orbx

Share this post


Link to post
Share on other sites
Posted (edited)

Here is 7 seconds of Orbx Monterey: fluid awesomeness, and still hardly using more than one core.

It still maxes out at only 20% overall.

I simply don't believe this sim would ever use 24 cores whilst flying, let alone more than 4.

This whole AMD v Intel question is still far from settled for P3D, beyond the fact that you need a high clock and one fast core. Intel still clocks the fastest.

 

 

Edited by I8Orbx
  • Upvote 1

Share this post


Link to post
Share on other sites

Now in VR it's still one core, at 12%, using an RTX 2080 Ti.

If AMD users are claiming 24 cores maxed out, I would like to see that, as it would mean P3D is optimised for AMD, which is certainly not mentioned in their release notes.

50F3C9BF-E8B5-46EB-8B29-884EBC8DA796.jpeg

  • Upvote 1

Share this post


Link to post
Share on other sites
21 minutes ago, I8Orbx said:

Now in VR it's still one core, at 12%, using an RTX 2080 Ti.

If AMD users are claiming 24 cores maxed out, I would like to see that, as it would mean P3D is optimised for AMD, which is certainly not mentioned in their release notes.

50F3C9BF-E8B5-46EB-8B29-884EBC8DA796.jpeg

I am getting similar usage, but with an RTX 2080 Ti overclocked to 2095. So it all still seems CPU-bound as before, just bound at a higher level. Maybe this new DX12 API is just that much more efficient? I agree that it is unlikely more than 4 cores will yield any improvement in FPS, and it still seems that higher clock speed is king in this game.

  • Upvote 1

Share this post


Link to post
Share on other sites
Posted (edited)
On 3/28/2020 at 8:58 AM, Doug Sawatzky said:

Hi

 

I decided on the Ryzen 3800X over the 3900X, as it seems to be the sweet spot in core and thread count while still delivering single-threaded performance on par with the 9900K.

Intel Comet Lake is unfortunately just a regurgitation of the old 14 nm architecture and will likely not have a decent upgrade path, whereas the AMD Ryzens not only maintain an excellent upgrade path but use a modern 7 nm process that is proving to be much more power efficient and to run cooler.

Although the clock speeds are higher on the Intel CPUs, the clock-for-clock IPC performance is better with the AMD Ryzens, and will be even better with the upcoming 4000-series CPUs.

 

 

ST 3800X.PNG

Doug, sorry, but until our sims are coded to actually utilise more than 4 cores, all these synthetic benchmarks are almost totally irrelevant, if not misleading, for simmers. If (and only if) our simulators actually start using more than 1 to 4 cores (mostly still one at the moment), then having ten zillion cores (joking) may be a winner. If you go to the benchmarking forums, it's just tech heads arguing AMD v Intel on a daily basis over these synthetic test scores. What matters is how well it works for simmers, and it's hard to see AMD outperforming Intel CPUs anywhere other than on value; if you want a pretty good, but not the best, flight sim rig, then AMD wins.

Edited by Mawson
  • Like 1

Share this post


Link to post
Share on other sites

My benchmark image is just an example of the single-thread/core performance of my 3800X. The single-threaded, clock-for-clock IPC performance of the AMD CPUs is now overtaking Intel's. The upcoming 4000-series AMD CPUs will bring even better single-thread IPC improvements.

  • Confused 1

Share this post


Link to post
Share on other sites
Posted (edited)
12 hours ago, Doug Sawatzky said:

My benchmark image is just an example of the single-thread/core performance of my 3800X. The single-threaded, clock-for-clock IPC performance of the AMD CPUs is now overtaking Intel's. The upcoming 4000-series AMD CPUs will bring even better single-thread IPC improvements.

Thanks Doug, but I am confused, as your test is from a synthetic program, isn't it?

 

We really need to see how it works in our actual sims. I know people are talking about IPC improvements, but what is key in our single-core-bound sims is FLOPS.

 

If you only use one core, you can disable the rest, and it all comes down to cycles, i.e. CPU speed (see the formula). Our sims need a CPU with very effective floating-point performance, and my understanding is that this is why Intel CPUs are currently still the best for flight sims and games that use one to four cores: they have superior (faster) floating-point calculation.

 

(attached image: the FLOPS formula)

 

As we reach the limits of clock speed, we need to increase IPS, but that is mainly achieved by increasing the number of cores or by a more efficient instruction set in the CPU, and both require support from the program we use. The program has to be able to use those cores and that more efficient instruction set, which is something our flight sim developers have failed to do, I guess because it takes a huge rewrite and might destroy add-on compatibility in the sims we love. Then along came new APIs (Vulkan and DX12) that have, I guess, offloaded some of the processing to other cores, but the prime core is still working hard on its floating-point calculations, working out the physics of our flight model.
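For what it's worth, the textbook peak-FLOPS arithmetic behind that formula is roughly cores x clock x FLOPs per cycle. A toy sketch (the 16 FLOPs/cycle figure assumes single-precision AVX2 FMA and is my assumption; real sustained throughput is far below this theoretical peak):

```python
# Rough sketch of theoretical peak FLOPS: cores x clock x FLOPs per cycle.
# All figures illustrative; real chips rarely sustain the peak, and a
# single-core-bound sim only ever sees one core's share of it.
def peak_gflops(cores: int, clock_ghz: float, flops_per_cycle: int) -> float:
    return cores * clock_ghz * flops_per_cycle

# One hot core at 5.0 GHz with assumed 16 SP FLOPs/cycle:
print(peak_gflops(1, 5.0, 16))   # 80.0 GFLOPS for that single core
# All 8 cores at a 4.7 GHz all-core clock:
print(peak_gflops(8, 4.7, 16))   # 601.6 GFLOPS on paper
```

The point of the arithmetic: once a sim is pinned to one core, only the first number matters, which is why clock speed dominates.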

 

The next major change is for the developers to redo their coding to make better multi-core usage possible and to use all the new instruction sets available, to get more efficient and achieve higher IPC, but I doubt that is coming even in the near future.

 

The AMD v Intel argument is clock speed v IPC, and which CPU to use depends on what benefits your program's coding the most. I am fairly certain our sims benefit mostly from higher clock speeds, due to the way the program works.
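That trade-off can be put in back-of-the-envelope terms: single-thread throughput scales roughly with IPC x clock. A toy comparison with made-up numbers (not measured figures for any real CPU):

```python
# Sketch: relative single-thread throughput ~ IPC x clock.
# Both inputs are invented for illustration only.
def relative_perf(ipc: float, clock_ghz: float) -> float:
    return ipc * clock_ghz

intel = relative_perf(1.00, 5.0)   # baseline IPC, higher clock
amd   = relative_perf(1.15, 4.4)   # ~15% better IPC, lower clock
print(intel, amd)  # 5.0 vs 5.06 -- roughly a wash on paper
```

Which is exactly why a bare GHz comparison settles nothing; the IPC term can cancel the clock deficit, or not, depending on the workload.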

 

If you want to see this in action, go to X-Plane 11 and crank up the number of flight models computed per frame (usually about 2 to 4) to much higher, and watch the sim grind as it tries to do the physics calculations many more times each frame.

 

I certainly don't disagree that it may all change in the future, but it needs a program that uses all the cores and the new instruction sets; for an example of what can happen with a better instruction set, look at the revolution in P3D v5 and X-Plane 11.50b4. I'd like to lead into a discussion of how this all may work with the new Windows 10-based experimental flight sim that can't be named, but I guess that is a discussion for another day, so I'll keep it to myself, sorry.

 

This topic is of great interest to me, as I have built my own flight sim PCs and simpits for years, and the back-chatter about the coming revolution is something I really enjoy. If it is the revolution in computing it seems to be, it will be the dawn of a new era in simulators, be they planes, trains, or automobiles.

Edited by Mawson
  • Upvote 2

Share this post


Link to post
Share on other sites
