NVIDIA expects to get away with this...

  • Published on Sep 20, 2022
  • New Customer Exclusive, Receive a FREE 256GB SSD in Store: micro.center/osd
    Check out Micro Center’s PC Builder: micro.center/va2
    Submit your Build to Micro Center’s Build Showcase: micro.center/zsd
    Get your JayzTwoCents Merch Here! - www.jayztwocents.com
    ○○○○○○ Items featured in this video available at Amazon ○○○○○○
    ► Amazon US - bit.ly/1meybOF
    ► Amazon UK - amzn.to/Zx813L
    ► Amazon Canada - amzn.to/1tl6vc6
    ••• Follow me on your favorite Social Media! •••
    Facebook: jayztwocents
    Twitter: jayztwocents
    Instagram: jayztwocents
    SUBSCRIBE! bit.ly/sub2JayzTwoCents
  • Science & Technology

Comments • 15 122

  • VGMoney
    VGMoney 2 months ago +11502

    I hope AMD/Intel can really put up some competition so NVIDIA can't pull things like this

    • elijah mashter
      elijah mashter Day ago

      @Jay Patel yeah I agree but AMD does have some hope

    • elijah mashter
      elijah mashter Day ago

      Yea I am quitting Nvidia and going to AMD

    • Free Zone Project
      Free Zone Project Month ago

      It's not a matter of performance.
      AMD has offered faster cards for less money repeatedly over the years and been outsold drastically.

    • ainkE
      ainkE Month ago

      All the chip companies are going to raise their prices. You know, because of the inflation and shortage of chips.

    • Redisia
      Redisia Month ago

      AMD has usually been better quality/price.... you can get more bang for your €$ on AMD... just usually not the cream of the crop

  • InadvisableOptimism
    InadvisableOptimism Month ago +285

    Waiting to see what AMD can do.
    Might end up with a 100% team red system for the first time.

    • Puritan 74
      Puritan 74 16 days ago

      Same here

    • Chris K
      Chris K Month ago

      @electropneumatic What do you have now? I have a RX 5700 XT and a Ryzen 7 3700X. I am thinking of stepping up the GPU game since it is currently the bottleneck when it comes to high fps 1440p gaming.

    • Michael Davis
      Michael Davis Month ago

      @Kostasdxi Really AMD just needs to not auto-enroll people in their beta drivers when they install Adrenalin. If you go in and disable the option to get beta drivers (turned on by default), and just use the stable drivers, you're gonna have a much better experience.

    • Michael Davis
      Michael Davis Month ago

      I'm 100% team Red now for the first time, because of the way Nvidia has been behaving and pricing their cards. Which is a shame, cause I used to be a hardcore Nvidia fan boy. I still own and use an OG Shield TV as my media center!
      I recently upgraded to a 6750 XT from a RTX 2060... got it new for the same price as a new 3060 Ti. The 6750 is at least 7% better performance vs. benchmarks I've seen for the 3060 Ti in my games (Guardians of the Galaxy, Vermintide 2, VR games, Half Life: Alyx, etc.). In my experience I'm actually seeing greater than 10% gains... not sure if that's because I'm rocking an AMD board and CPU, or what. On top of that, 6750 has 12 GB of RAM vs. the 3060's 8 GB. So a much better value proposition for me in the long term. I was worried about driver issues, but after two weeks I've seen no problems with stuttering or any weird behavior. This idea that AMD drivers are bad seems dead. It was definitely true years ago when I last experienced with AMD's cards, but my experience was rock solid switching my system over (make sure to use DDU to get a clean uninstall of Nvidia drivers before putting in your AMD card).
      Really the only leg up the Nvidia cards have right now are ray tracing. Which in most games is a barely noticeable improvement in visual quality. I'm sure as ray tracing tech matures, that gap will shrink and they'll be pretty much on par.
      I would say if you're thinking about it, at least from my experience, the switch was worth it. I got better performance for my $ and the experience was smooth and hassle-free.

    • The Chemist
      The Chemist Month ago

      A 2-core iGPU can run games; AMD has it in the pocket

  • Jawzzy
    Jawzzy 2 months ago +175

    You are absolutely right. They are just giving lower-end cards the next class's name and hoping nobody notices, while at the same time increasing the performance gap between classes so they can later slot in the Ti versions. That 12GB 4080 is what the 4070 was supposed to be. I really think blind greed got the better of them this time.
    We have insanely high prices and a huge number of second-hand cards flooding the market now that mining went tits up. Add to that the brutal increase in prices of things people actually need to survive, and the insane power draw of these new cards. I think Nvidia will have a heck of a hard time moving the new cards this time around.
    A lot more people will pay attention to AMD's release this time around. AMD, if they have a good product, may be able to snatch a solid chunk of Nvidia's market share.

    • j4log
      j4log Month ago

      why look at the card half empty instead of the card half full? did they nerf the 4080 10GB or did they actually just buff the 4080 12GB?
      no no, see, the 4080 10GB was correct after all. what's actually happening is, in the 4080 12GB, they secretly swapped the chip with a 4090 chip. hence the performance increase.
      some of you might say, no no, there's already a 4090 chip and they're in the 4090 cards. but see, in all the 4090 cards, they actually just secretly swapped the chip with a better-performing 40100 chip. but they thought that it would be too confusing to have a 40100 product line, so they decided to just upgrade all the chips in all the other cards.
      so there you go, they're actually secretly boosting the 12GB card's performance out of the goodness of their hearts and only didn't tell anyone about it because they wanted it to be a surprise ;) way to be negative nates and nancies guys

    • Aced
      Aced Month ago

      i was an nvidia user from the 9xx series all the way up to the 20xx series
      last year i custom built my PC using the 5800X as my CPU and the 6900 XT as my GPU..... best build i ever made in my entire life, not gonna lie
      i have had zero issues with either temperature or hardware

    • Azorees
      Azorees Month ago

      @Jovan Lovett I too love the driver/software and hardware, but there is a limit to how many times you can screw over your loyal customers. I am definitely looking at what AMD brings out, and if they have a card to give the 4xxx cards a fight then I will be looking at changing over to AMD and changing out my 3080 10GB, simply because 1 month after buying it they brought out a 12GB version of the same card, and they wonder why people get pissed at them :)

    • Rich
      Rich Month ago +2

      After looking at the cost vs. performance, I've already decided on swapping for my next build (in the process now, 6-year-old rig). Going AMD all the way due to bang for buck and the fact they work together (AMD CPU and GPU) for better performance.
      Nvidia lost me with greed. Heading into a recession with no extra cash coming in, this will be the last build in a while, so Nvidia and Intel, keep your high-end expensive kit; I'd rather save dollars and get the same if not better performance for less.

    • Jawzzy
      Jawzzy Month ago

      @Angel Demore Oh I know that. I got a boosted 1080 Ti the week it was released, and replaced it late last year with a boosted EVGA 3080 Ti.
      The 3080 Ti had to be my worst buy. The performance is there, but it was expensive as hell and it was running really hot.
      My 1080 Ti Aorus Master was my best GPU buy so far, and the way things are going, I doubt it's ever going to be matched by anything.

  • Mikey Bee
    Mikey Bee 2 months ago +130

    I'm actually excited to try out AMD's next GPU lineup for the first time

    • Sherrif
      Sherrif Month ago

      @Dan Yates So for people wondering why AMD tends to have more issues like this:
      It's dev support. While devs DO work with AMD, pretty much every dev on the planet works with Nvidia. So what happens is every game gets worked on with Nvidia's help, so the drivers release with support for every game. Since some devs don't work with AMD, you have situations where you experience these problems, while other devs end up resolving the issues accidentally.
      Let's deconstruct the argument here though, as somebody who's used both cards:
      "Shocking drivers" is not a complaint. I don't really understand what you're trying to say, beyond attempting to stack onto a short list of complaints. What's so shocking about them that you thought it was acceptable to file a complaint here?
      "Constant crashes" is a two-way street, always has been. Nvidia crashes just as often as AMD, though maybe a specific game you were playing at the time caused more problems. Dark Souls 3, I remember, none of my Nvidia friends could play (at that time my Nvidia card couldn't even handle the game, so I only had stories). When we get to these really complex graphics engines these cards are always going to have instability; hell, your issue may even be that the specific card you have in the lineup was never tested. To say that AMD crashes more often is a bit silly; both manufacturers crash, it really just depends on what you're doing. (Also keep in mind that Nvidia doesn't always tell you when it crashes, whereas AMD trusts your intelligence enough to always ask you to send error reports whenever there was a problem.)
      "Ridiculous temps" is kinda weird, honestly; my current AMD card runs at frame limits without breaking 70, but the Nvidia card at idle almost breaks 70. AMD cards have a history of running warmer, but people that bring this up tend to be people with a chip on their shoulder, not people with legitimate experience with the hardware.
      So "never again", you say? With two complaints that mean literally nothing and an issue that is going to depend on the games you're playing, I would wager the truth is "never did"

    • KremsonKhan
      KremsonKhan Month ago

      @Dan Yates Lol I completely understand you, you just need to find the right driver version! Like for my 5600 XT, it is [RadeonSoftware-March2022]; the rest are all crashing or green lines!

    • Dan Yates
      Dan Yates Month ago +1

      I was excited when I bought a 5700 XT on release day, however it was the most infuriating experience. Shocking drivers, constant crashes, ridiculous temps. Never ever again will I buy an AMD GPU

    • KremsonKhan
      KremsonKhan Month ago +3

      I switched back to AMD in 2020 and the last 2 years have been great! Like a very good deal,
      only the software needs help.
      Some versions are super OP and some will break your PC, but overall with the right software it works like a charm!

    • Terbojeezus
      Terbojeezus Month ago +5

      I took my first step away from Nvidia in years when the 6900xt was released and couldn't be happier. AMD has absolutely been delivering on their promises and I hope that holds true this November.

  • sleye
    sleye 2 months ago +111

    This is the cycle. Honestly, I think it's a sign that AMD is going to put out something good. Nvidia wants to extract as much wealth from the system as it can before its dominance wanes, which ironically is going to push more people to the other side

    • TrueSSJHavik
      TrueSSJHavik Month ago

      @Angel Demore Dude people make games for Console. Which is RADEON. What are you talking about?

    • Angel Demore
      Angel Demore Month ago

      "dominance wanes" is hopeful at best like dude amd on top? LOLLOOOOOOOL LOLOLOL OOOLOLOL thats actually hilarious. You are probably someon who things radeon is even comparable to current nvidia. literally no one makes games with drivers specifically for radeon not like they do nvidia.. like dude its not even in the same realm. Radeon is just showing up like they always did. Now Intel is in the market too.

    • Punktak
      Punktak Month ago +1

      @Freddy Figueroa Neither does NV. Very little excitement since the 3000 series.

    • Freddy Figueroa
      Freddy Figueroa Month ago +1

      I think it's the opposite: AMD is bringing nothing exciting to the GPU market, so NVIDIA might be able to get away with this

    • Mykle
      Mykle  Month ago +8

      Yea, I think you're right. Stockholders have too much power; it's evident in almost every market. Guys on the outside just try to make their money worth more, yet they destroy products.

  • George Anderson
    George Anderson 2 months ago +5847

    First they manipulate the price of the cards, then they cut off almost half of the cores and still charge almost $1,000. It's time we got some diversity in these cards. EVGA is looking a lot better today for their decision. If our leather-coated friend wants to be Apple so much, he should just go work for them. First we got shafted by the pandemic, then they shafted us by selling most of the cards to Bitcoin miners, then it felt like a paper launch of the 3070 series, then they artificially inflated the price by controlling the supply, and now this. I don't care how many cool gadgets like Resizable BAR or ray tracing they have; Nvidia right now looks like they don't know how to run their business, and they've lost the mindset of their value customers, the ones who helped build them up to what they are today.

    • donald Norris
      donald Norris Month ago

      @r3mxd that's the shittiest justification ever. Go sit in the corner kid

    • CamboTechGaming
      CamboTechGaming Month ago

      @Andrew I've built gaming rigs and mining rigs, Nvidia features and drivers isn't any better than AMD at all. I would say they are pretty much the same for the 6000 series and 3000 series.

    • The Late Shift
      The Late Shift Month ago

      @Jamison Munn They will be sold out the moment they hit stores; people will buy this stuff no matter what. Then the "we can't find the cards anywhere" hate will start again.

    • DJ Golden
      DJ Golden Month ago

      @r3mxd such great advice

    • Jacques Esprit
      Jacques Esprit 2 months ago

      Exactly! You hit the nail on the head!

  • Zangetsu
    Zangetsu Month ago +6

    Depends on the price but I'd stick with a 2080 Ti. In my opinion, it's still the best card from Nvidia at the moment, everything after that is just too inefficient, oversized, expensive and way too hot, which also results in them being loud ( air cooling ). Just that the 2080 Ti is a bit better than a 3070 Ti says it all in terms of progress. In addition, the prices go down drastically for the 2000 cards.

    • Maxatal
      Maxatal Month ago

      Yes. 3000 series was supposed to be the refined version of the 2000 series where they get a handle on power draw, heat and size, something we’ve seen before. (1000 series kind of being a refinement of Maxwell). But they pushed it further for performance. Then we were hoping the 4000 series would tone it down. Nope, more is merrier apparently.
      My 2070 Super will last me till 2025 at the least.

  • Rex Yoshimoto
    Rex Yoshimoto 2 months ago +105

    You got that right. I remember when the Nvidia 970 was advertised as a 4 GB card. It really was 3.5 GB. A class-action lawsuit followed.

    • derek400004
      derek400004 Month ago

      @narmale the need for speed is real lol

    • narmale
      narmale Month ago +1

      You're right, I don't need 12 vs 16... I need the 24 gig

    • derek400004
      derek400004 Month ago +3

      @Rex Yoshimoto if Leatherman was half as reasonable as you, we wouldn't be having this conversation here today haha. Sad state of the industry

    • Rex Yoshimoto
      Rex Yoshimoto Month ago +1

      @derek400004 Point taken. I stand corrected.

    • derek400004
      derek400004 Month ago +8

      @Rex Yoshimoto my response wasn't to you, but to the other commenter who said Nvidia did give the GTX970 4gb ram, which was missing the point.

  • Aaron Simpson
    Aaron Simpson 2 months ago +33

    I was pretty stoked to buy a 4090 next month, but this is at least giving me second thoughts. Thanks for the insights!

    • Azorees
      Azorees Month ago +3

      i was looking at the 4080, and if Jay hadn't mentioned the core count difference and the fact that the 12GB is basically the 4070, i would have gone for that, as many games at the moment don't need more than 12GB at 4K. So i'm glad he mentioned that and saved me from making a huge, expensive mistake. i can only wish that 50 or even 75% of Nvidia users boycott them and move to AMD; i wonder what Nvidia would do then :)

    • Steve B
      Steve B 2 months ago

      Same here...

  • David Wagstaff
    David Wagstaff Month ago +4

    Absolutely SPOT ON! I would definitely assume the same GPU chip is being used if the product is labelled 'RTX 4080' with only a difference in on-board memory. With the 'Ti' moniker I would expect performance variation from clocks, cores, or both.

  • LAZY YELLOW DOG
    LAZY YELLOW DOG 2 months ago +5090

    The price hikes and loss of EVGA are major deterrents to me going RTX 4000.

    • CarnEvil
      CarnEvil Month ago

      wait, why did they lose EVGA?!

    • The_TEGGZ
      The_TEGGZ Month ago

      @Anthony Steible Gigabyte is like the Hyundai of GPU manufacturers. Cheaply made. The only thing worse is a Zotac card

    • Anthony Steible
      Anthony Steible Month ago

      @The_TEGGZ What's wrong with other ones like Gigabyte? I got a 3080 Gigabyte, and a lot of GPUs I see are Gigabyte.

    • 4doorsmorewhors
      4doorsmorewhors Month ago

      @Charley Riddle why?

    • klaus pedersen
      klaus pedersen 2 months ago

      Lol 90% will still buy Nvidia no matter what Nvidia does. And Nvidia knows that. 🙂

  • Saffy7411
    Saffy7411 2 months ago +142

    Honestly, with the size of your channel, you shouldn't be worried about freebies from Nvidia. Having said that, I'm glad that you're not afraid to speak your mind and care enough to share with people exactly what is happening. Keep up the good work!

    • DeosPraetorian
      DeosPraetorian Month ago +6

      Yeah, the thing is, not getting review samples hurts the performance of the videos, because they will come out later instead of when the embargo lifts

    • Moe N/A
      Moe N/A 2 months ago +3

      He joked in another video he'll probably end up having to buy the cards to review them

    • N Kenchington
      N Kenchington 2 months ago +7

      free bees. wasps $1.

    • Relaxing Perspectives
      Relaxing Perspectives 2 months ago +36

      It's not the money. It's getting samples earlier so reviews can be published as soon as possible.

  • LunaticStrike
    LunaticStrike 2 months ago +14

    Quite an interesting video. Honestly, I had to replace my children's two old computers with old GTX 970s one year ago, and ended up buying two notebooks with RTX 3060 graphics chips.
    Meanwhile I still run my 980 Ti, and I will stick with it as long as it runs. I was planning to buy an RTX 3000 series card but skipped it due to the insane pricing back then. Meanwhile I was hoping for the 4000 series, but I will skip that too. I am currently planning to buy the next AMD card once it's released and completely avoid NVIDIA, or maybe go for the new Intel ARC cards if they turn out to be good. I am not entirely sure.

    • Ionstorm
      Ionstorm Month ago

      Intel ARC's top line makes more sense than buying a 3060. Also, with laptops with RTX cards you need to be very VERY careful, because they all have different TDPs.. honestly it's a shambles.
      Say you buy a laptop with an RTX 3070 in it: there could be one with a 100W TDP and one with a 200W TDP which performs way better, but no one will tell you the TDP without closely looking at the specs (that alone should be illegal imo). Anyway, yeah, just take care.

  • Joshua S
    Joshua S 2 months ago +2

    I feel like the 4080 12GB should have been marketed as the 4070 Ti and released with the Ti series; releasing it under the same name as the better-specced 4080 does seem like a scheme to trick new and unsuspecting buyers

  • TROLLDATSHIT Yeah You
    TROLLDATSHIT Yeah You Month ago +9

    I'm mostly happy with my 1080 which is why I can wait for what AMD will show

  • Giant Midget
    Giant Midget 2 months ago +1395

    I remember paying $400 for my 1070; those were the days. Even then I thought that was ridiculous. Nvidia is spitting on us these days.

    • Ionstorm
      Ionstorm Month ago +1

      I remember paying £650 for my RTX 2080 and crying with buyer's remorse that it was pretty much the exact same performance as a 1080 Ti, but cost twice as much at the time.... and DLSS actually lowered my FPS in Battlefield V.
      Honestly, the RTX 3090 I bought made me feel guilty buying it, it cost so much - £1605 for the Suprim X... but very luckily, nearly a year later I managed to sell it and get a 6700 XT for literally £150 after the profit from the 3090.
      I'd say now NVIDIA are dead to me. They'd better cut prices and stop trying to trick customers with their BS marketing mind games if they want loyalty.

    • JPker
      JPker  Month ago

      @Cuppat Same, I'm on a 1070.
      I don't know if I can have a 30 series in my system, because my power supply would start coughing...

    • @Resident_41
      @Resident_41 Month ago

      @Mirage bootlicker detected

    • LioVir
      LioVir 2 months ago

      Bought a used 1070 back in 2019 for 200 bucks, still going strong.

    • El tumbador de agujas.
      El tumbador de agujas. 2 months ago

      i paid less than €400 for mine and i still use it; now the top-of-the-line card costs more than my entire PC did when i built it in 2017.

  • Suilujz
    Suilujz Month ago +7

    RTX has been such a farce all the way through. Turing to Ampere was the only good upgrade, and even that was mostly achieved with higher power consumption. The main selling point of the 4000 series is DLSS 3 with frame interpolation; it's awfully convenient when your 4K benchmarks are running at an unspecified resolution and you have artificial frames counted in.
    They haven't told us shit about raw rasterization performance, because they know they can just focus on optimizing their AI gimmicks rather than improving their architecture as much as they can.
    In the end they can do what they've always done and make their tech exclusive and not usable by AMD; then after they stop pushing it, the feature just dies. Remember PhysX or SLI?

    • Suilujz
      Suilujz Month ago

      @Sherrif AMD drops technology that runs even on Intel HD Graphics*; Nvidia drops hardware-dependent shit which they can't decide if they'll half-assedly support on their own older GPUs (support for a graphically cut-down version of ray tracing running at 15 fps on a GTX 1080?! wow!)

    • Sherrif
      Sherrif Month ago +1

      It's honestly a reason I just respect AMD cards over Nvidia. The features they add are neat, but most of the features they "lock" to their cards aren't using special cores that accelerate that specific process. FSR was a gift to people that had older cards, because the technology wasn't "locked" to hardware in an attempt to sell it. Sure, ray tracing is a neat tech and this might be the first generation to have usable ray tracing, and since that uses specific hardware that has to be delivered, it just makes sense for it to be "locked".
      I really do enjoy watching AMD flex on Nvidia like this. DLSS comes out and "wow, you need the newest card"; AMD just drops off a technology that runs on Nvidia cards, basically calling them out on their bullshit. I'm not a fanboy, but it's fun to see that kind of exclusive marketing get absolutely flexed on.

  • tri / Xithar
    tri / Xithar Month ago +4

    This is the main reason why I buy only after reading not only initial reviews but also the ones a month later, as well as waiting for enough user benchmarks to see how well the cards perform in real life.
    In regard to buying the fastest, most expensive card: I made this error once with the 1080. 3 months later the 2070 came out with mostly more performance in games at only around 2/3 of what I paid. So I swore to myself: never again. Usually now I'm not really looking at the numbers in the names much. Atm I have a 2080 Super and a 3060 Ti (had to buy the 3060 Ti when the 2080S had to be repaired after 1.5 years because of constant overheating). The 2080S performs a very small bit better, but the 3060 Ti uses less power, doesn't produce nearly as much heat (2/3) and runs much quieter, which is why this is now the card I'll be using till I get the itch to upgrade again - which with such prices might take a looong time.

  • weeschwee
    weeschwee 2 months ago +2

    I've always bought Nvidia, but those prices seem way high. I think my 1070 was around $430, and that's the most I've spent on a graphics card. I feel like price vs. performance should get better as the industry advances, not worse.

  • addusernamehere
    addusernamehere 2 months ago +9

    Great consumer awareness video, Jay! Nice work!

  • World Kat
    World Kat 2 months ago +931

    The irony is, it's clear NVIDIA did this to try not to get backlash for a $900 4070, but in the end they only made people even more angry.

    • Yellowblanka
      Yellowblanka 2 months ago

      @non sense statistically, they won’t sell anywhere near as many as previous midrange gpus, sadly though, they just have to sell enough to justify the pricing. It seems nvidia is leaning into the childless e-peener market full tilt.

    • OutOfNameIdeas
      OutOfNameIdeas 2 months ago +1

      @John Henderson No. The 4070 is the 12GB 4080. The "4070" you're referring to is the 4060.

    • Eric Wagstaff
      Eric Wagstaff 2 months ago

      @John Henderson I think much later, depending on availability and how low pricing on the 3090/3080 goes, because price to performance may not be good on anything lower than the 4080 16GB when cheap 3090 or 3080 Ti cards are plentiful. We have to see actual performance numbers, but I bet that's why the 4080 12GB wasn't named a 4070 like it should have been.

    • Venox
      Venox 2 months ago

      @John Henderson You seem to have missed the point. Fucking read my guy.

    • Arden Stellaris
      Arden Stellaris 2 months ago +2

      Almost $1k for a mid-tier, 70-series card is absolutely insane

  • William Heim Photography
    William Heim Photography 2 months ago +4

    I've been waiting over two years now to upgrade from my 1080 Ti and haven't yet due to crazy cost and no availability. Just recently I've seen the 3090 Ti in stock with a price drop to $1099 on Best Buy, but didn't purchase because I was hoping to see the price drop even further. Now I'm kicking myself because the cards are once again "out of stock". With the announcement of these new cards coming (eventually), what would you think my best upgrade path is? I have quad QHD monitors (3 set up as linked triples) and use my PC for 4K editing and iRacing. I'm hoping for a significant performance increase over the 1080 Ti for these applications, but don't want to overpay due to these "farmers" driving prices up. Your videos are very informative, but most of the information is way over my head. I just wish someone could tell me the best bang for the buck and where I can buy without getting gouged on price. Thank you

  • Kevin Galloway
    Kevin Galloway 2 months ago

    The disparity in the number of cores was the first thing I noticed, especially when compared to the previous generation. Sure, the clocks are almost twice as fast, but with a significantly reduced number of cores, I’m not impressed. After seeing the specs, along with the pricing and availability of the 3080Ti right now, I picked up the current gen.

  • Mojo
    Mojo 2 months ago +2

    When it comes to graphics cards I've gone back and forth between AMD and Nvidia, so I'm really not concerned about what's called what, since I'm going to get whatever is best from a price-to-performance point of view. And honestly, the high end of graphics cards doesn't really tickle my fancy. These days I'm most impressed when I see what my Steam Deck is capable of with a measly 15W TDP.

  • bassx101
    bassx101 2 months ago +11

    Micro Center is an excellent place to shop for electronics and PC-builder enthusiasts; so glad you have these guys as a sponsor. Good co-op, Jay 👍

    • Andrew Muse
      Andrew Muse Month ago

      I got all my parts at micro center to build my pc

  • Grant Mckenzie
    Grant Mckenzie 2 months ago +4236

    The more I read into Nvidia, the sleazier their business seems. No wonder EVGA cut ties

    • Dregs
      Dregs Month ago

      @Ionstorm I believe that

    • Ionstorm
      Ionstorm Month ago +1

      @Dregs I can assure you there was a clusterF of issues with 5700 XTs and drivers, but since going to a 6700 XT after selling my 3090, no issues at all, and hey! they even fixed FreeSync above 144Hz! Bonus. AMD's software is WAY superior in every way.

    • Bricyn Jameson
      Bricyn Jameson 2 months ago

      Ever since I got into PC building (2014) I've learned more and more that Nvidia is right up there with Apple in flipping off their customers. I do my best to remain unbiased and just look at the products, but it's getting even harder to not just go AMD out of principle.

    • Free Smith
      Free Smith 2 months ago

      Nvidia sucks and has been profiting off crypto while destroying the planet.

    • Dario Mladenovski
      Dario Mladenovski 2 months ago

      EVGA, Tesla and Apple all stopped doing business with them

  • Nick Cucchiara
    Nick Cucchiara Month ago

    Thank you, great video and subject matter. I think it's pretty good timing for me, building a creator PC. Seeing the 4090 should mean the 3090 Ti 24GB card's price will take a dive, and I would be happy with that card when required for workflow, supplementing an i9 K. If you want a good nerd laugh: this is replacing an i7 875K Lynnfield which has lasted 12 years now. Literally the only two things replaced were a GPU fan and my HDD, which summoned the black screen of death.
    You're absolutely correct about including all the specs on the box, which should roll down the ladder straight to web sales as well. Though for someone like me out of the loop, it's more beneficial for someone like you to release the Kraken to the public with an overview that's on point.

  • Steve Christo
    Steve Christo Month ago +8

    Saw this coming when they started selling so many cards during the pandemic that they couldn't keep them on shelves, even at double or triple MSRP. They made the stockholders really happy, and now they have to find some way to keep making the same amount of money or the stockholders get mad. A big change is coming; whether it's bad or good I don't know.

    • Basement Dweller
      Basement Dweller Month ago +2

      Artificial growth can't be maintained forever; I get what you're saying. Their chart wasn't organic: they pumped, and at some stage now they're going to dump, because they were manipulating their own market.

  • Grisu19840
    Grisu19840 Month ago +2

    I'm done with Nvidia. I already switched to Ryzen over Intel years ago, and now, depending on what AMD does with its upcoming cards, I'll go full Team Red.

  • Dustin Hoffmann
    Dustin Hoffmann 2 months ago +4

    I totally agree with you on this, Jay. Nvidia makes good products, but the company's morals are really fucked up! I stopped buying Nvidia GPUs many years ago and have only gone with AMD cards since then, and I think it was a great move for me. I gamed for over 23 years, starting in the 80's, and I still fire up a game once in a while. I just can't believe people are so obsessed with having the best GPU out there that they're willing to spend $1000 on a graphics card; that's just insane to me lol! I'm glad I have no interest in high-end GPUs, and these 40 series cards are not only expensive but BIG, power hungry, and give off a ton of heat.... no thanks! It's too bad that people like Jensen and all the assholes in charge at Nvidia will do such shady AF things just to make a buck. I feel bad for the workers who do all the hard work there, like the engineers and technical staff and logistics people, who have to deal with the 'big wigs' high up and their BS. I won't buy an Nvidia card again until they change their behavior, but at the same time, I wish people weren't so desperate to have an Nvidia GPU that they're willing to sell their soul just to get one. It's really too bad.
    There are more important things in life than video games, guys!

    • Azorees
      Azorees Month ago +1

      I agree 100% with Jay on this too. I remember when a PC build that competed with current and next-gen consoles was only around 2 to 2.5 times the price; with the new pricing on Nvidia cards, that gap has jumped to nearly 4x. No wonder console players are laughing at PC gamers. Look at it this way: for the price of a 40-series card I can buy a PS5 and the Xbox and probably still have a little change left over.

  • awesomejakex !
    awesomejakex ! 2 months ago +1218

    Nvidia’s marketing is completely predatory and they are fully aware that some consumers just won’t know any better and will get duped by the 4080’s naming scheme

    • John Doe
      John Doe 2 months ago

      @Roger Roger 10-4 Hello fellow Maxwell GPU chad. My 970 still maxing out just about every game I have at 1080p. EVGA 970 FTW+ edition. Single slot cooler, compare that to the new shit.

    • deviildogg1
      deviildogg1 2 months ago

      They are using the old Intel strategy 😂

    • Ray Tillman
      Ray Tillman 2 months ago

      @Roger Roger 10-4 I'm on my second card since my 980ti, i seriously don't know why I bothered except for VR needs power!

    • Nick Sanders
      Nick Sanders 2 months ago

      The specifications don't mean anything anyway. Wait for the cards to come out, look at how many FPS each card gives you in the games you play or want to play, then choose based on that. Naming means nothing; they could call it anything, and it won't change the performance.

    • LoCk3H
      LoCk3H 2 months ago +2

      @Damien Tech tossa...

  • KLTechNerd
    KLTechNerd 2 months ago +6

    I have always used Nvidia. This is the kind of stuff that makes me consider switching to AMD. I hope they step up.
    Thanks, JayzTwoCents.

    • Azorees
      Azorees Month ago

      Same here. I've been with Nvidia for 20+ years and am now looking at swapping over to AMD. I'm already looking at upgrading from my 3080, so if AMD brings out a card with similar speed and performance to the 4080 (16GB), I'll be looking at that card.

  • Kyle Szalontay
    Kyle Szalontay Month ago

    This really just makes me want to see what AMD and Intel can do.

  • Bits BFG
    Bits BFG 2 months ago +2

    We feel the same. It's come a long way since the NVIDIA SLI days, and we wouldn't be surprised if the 4090 is shaved down to make the 4080 Ti fit, and they just name it that.

  • Daniel M
    Daniel M 2 months ago +1

    I've had bad experiences with AMD in the past (CPU, not GPU), so I'm hesitant to jump ship. Whether I buy this will depend on a few factors:
    1. The actual increase over the 3080: if it's really around 2x, then to me it's good value.
    2. AMD's power-to-price ratio. I may not like them as much, but if it's comparable power for a few hundred less, even I can't deny the better value.
    3. Availability. More than likely one is going to be easier to get, and from how people are talking up AMD, I'm guessing Nvidia will be easier for me to find.
    However, the naming scheme is a bit scummy. One could argue it's lessened by more people being able to analyze the specs, but that's still weak.

  • Kaytrim
    Kaytrim 2 months ago +1909

    Nvidia is a money-grubbing company. I am 100% behind you on this, Jay. The cards are great, but the pricing and naming games are only to their benefit. I am both glad and sad that EVGA broke up with them.

    • Veritas et Aequitas
      Veritas et Aequitas 2 months ago

      I hope EVGA survives. They used to be almost as good of a company as BFG. 'Member BFG and their unlimited, no questions asked, TRANSFERRABLE lifetime warranty?

    • jankerson1
      jankerson1 2 months ago

      They are all money-grabbing corporations; that's what they do, and their ONLY objective is to make more money at a higher profit margin for a bigger bottom line. THAT IS THE REALITY OF WHAT THEY DO... Wake up, people: corporations are, in the end, the worst and most evil thing on the planet currently.

    • Richard McBeef
      Richard McBeef 2 months ago

      The boot lickers in this thread 😂

    • The Echelon
      The Echelon 2 months ago

      Have you seen the new cards though? They're massive shit bricks with the tdp of a small heater

    • thegreatbeavers
      thegreatbeavers 2 months ago +1

      And to think people out there think Jay is a scumbag for making videos about this situation. Jay is willing to admit his faults and actually thinks of the little guy.
      I wouldn't be surprised if there are people out there hating just because they're trying to gain something from it.

  • Ash3s Gaming
    Ash3s Gaming 2 months ago +1

    So my theory about the CUDA core difference between the 4080 12GB and 16GB involves the Tensor core and RT core counts. The 12GB may have a higher RT core count to make up for the lack of memory and CUDA cores, so raw performance is lower than the 16GB, but with DLSS running on both, performance ends up equal. So idk, we will see when it hits.

  • 1Dshot
    1Dshot Month ago

    One of my pet peeves is companies with confusing naming conventions. MS & Apple have this issue with their Xbox consoles and iPads/iPhones respectively. So many variants make it confusing and difficult to tell what you really need, and it slows down the buying process anyway 'cause you have to parse all the differences between them.

  • Daniel Mullins
    Daniel Mullins 2 months ago +1

    I'm still rocking a 1080, so I'll probably just buy a 3080 whenever I do upgrade. I've held out mainly due to pricing, and because the 2080 really didn't impress me. To be fair though, the 1080 (OC liquid cooled) is only now starting to show its age with AAA titles, at reasonable settings (2k @ 75hz w/ mid-high quality). If it wasn't for Forza 5 lag spikes, might not upgrade at all.

  • Joseph B
    Joseph B Month ago

    Seeing their benchmarks now, I'd say they got away with it just fine :)

  • James K
    James K 2 months ago +281

    You know NVIDIA has become the enemy of the consumer when they announce "dropping GPU prices are a thing of the past" after years of dramatic price *inflation*.

    • yes yes
      yes yes 2 months ago

      @mcl48YT They are, probably out of desperation, or because they are crypto miners who somehow got the idea that if they purchase one hundred 3090s and run them in rigs mining crypto for 2 years, they will turn a profit. You must realize these were being bought in bulk by crypto miners and scalpers, both looking to make a profit; individuals purchasing 100+ 3090s or other 3000-series cards happened a lot.

    • mcl48YT
      mcl48YT 2 months ago

      @yes yes Well are people paying 2 grand? That's the only question that matters. The cards were clearly underpriced before if they're still flying off the shelves.

    • James K
      James K 2 months ago

      @Jason Perry A glut in supply, a reduction in demand -- price increases are the natural corollary.
      Welcome to Opposite Day.

    • yes yes
      yes yes 2 months ago +1

      @Jason Perry The price inflation came from individuals buying in bulk to mine crypto; then people started buying the cards in bulk to sell at a ridiculous premium. The same thing happened to the GTX 10 series. Crypto died down, so the RTX 20 series wasn't as badly affected; then it happened to the RTX 3000 series. But Nvidia did not set their prices higher in anticipation of crypto miners buying the cards; they set the MSRPs higher because they wanted to. The scalped prices were comical with the RTX 3000 series, but the intended MSRPs were not much better when you compare them to previous generations. Back in my day, a top-of-the-line consumer flagship card was no more than 600 bucks. Nowadays it's 2 grand?

    • Jason Perry
      Jason Perry 2 months ago +2

      I feel like a few people here don't understand the concept of supply and demand from ECON 101.

  • Equin Starbeat
    Equin Starbeat 2 months ago +8

    The best part about all of this, is that even with these prices, board partners may not even be making a profit on these cards.

  • Your STATISTACS
    Your STATISTACS Month ago

    I'm really certain that next go-round I'll be going with AMD. I got suckered into a 30 series because I knew someone who worked at a Best Buy, so he hooked me up and I stupidly impulse bought. It's going to be a while until I get a new card since I just got this one, but I'm pretty sure I'll be getting AMD as long as they keep playing their cards right.

  • James Huggins
    James Huggins 2 months ago

    My workstation has been AMD for years. Looks like it's time for the gaming setup to make the swap. I almost refused to buy a 20/30 series due to that whole debacle, EVGA's amazing warranties were honestly the only thing keeping me in. Team red it is. I have to wonder if the EVGA situation will knock some sense into NVIDIA or force them to start behaving. We will see.

  • Clark Meyer
    Clark Meyer 2 months ago +1

    I agree. I don't know all that much about computer hardware; I get told what specs to look out for, but the boxes never show the details I'm looking for. It definitely should be a requirement, especially when you are making a huge investment. It's like being told there is a house up for sale but you aren't allowed to know the details except for seeing the front of the house.

  • Eric Bonow
    Eric Bonow 2 months ago +724

    The EVGA split was a big deciding factor for me to stay away and this deceptive naming seems to be in alignment with their business practices.

    • PAPA Smurf
      PAPA Smurf 2 months ago

      @Callandor99 Wrong. EVGA was one of the biggest sellers of GPUs and would have grossed a lot more on this 4000 series, but for lots of reasons they just don't want to be under Nvidia's thumb. I think they've got an ace up their sleeve and will be back in some other way to compete. I hope.

    • Callandor99
      Callandor99 2 months ago

      EVGA only backed out because they already made a shit-ton in the crypto market, and now it's dead, so they're out.

    • PAPA Smurf
      PAPA Smurf 2 months ago

      You're probably already standing in line at a Micro Center because you're a super Nvidia fanboy.

    • Freestyle
      Freestyle 2 months ago +1

      @Crazy Dude Except EVGA would have had no idea about the prices, genius.
      Nice try, though, fishing for likes by posting something without doing an ounce of research.

    • s1mph0ny
      s1mph0ny 2 months ago

      EVGA created their own problems, they've been selling $300+ video cards without backplates which is insane.

  • SOLiiTUDE GAMING
    SOLiiTUDE GAMING Month ago

    I was gonna buy a 4080 16GB, but honestly at this point I think I'll wait for AMD to show off their cards. I have a feeling they'll be on par in ray-tracing performance, better in rasterized performance, and cheaper.

  • Montgomery Bonner
    Montgomery Bonner Month ago

    This will most likely work out in the wash to some degree. Next year I will revisit and see what has changed. The ATX 3.0 spec and connectors are more important to me.

  • Matthew Frank
    Matthew Frank 2 months ago +2

    Yep... I agree, Jay. There's not much worse than a company with good products being out of touch with the average consumer. I am only guessing here, but I believe Nvidia is pricing a lot of its customers out of the GPU market. I skipped several generations because of prices, and it appears I will do it again.

  • Dean Philip
    Dean Philip 2 months ago

    Back when I was making my first build and had no idea about GPUs, I also made the mistake of getting a 3GB 1060. I was so pissed when I found out what Nvidia was doing. For a new buyer, this is really annoying and unfortunate.

  • Seth Hoffman
    Seth Hoffman Month ago

    My next rig will be AMD CPU and GPU, no doubt. I've been on Intel and NVIDIA since the Core architecture. I used to prefer NVIDIA over ATI\AMD, but the recent events have soured me on them. Back in the day I was an AMD and ATI fan in the Athlon and Radeon days. It will be a nice return to form.

  • Jariel Rotta
    Jariel Rotta 2 months ago

    I'm going AMD for my next card. My 1060 6GB is a great card and still holds for my usage, but NVIDIA is doing a lot of stuff that bothers me...

  • nrosario
    nrosario 2 months ago

    Totally agree with you: Nvidia is selling what should have been called a 4070 as a 4080. I hope consumers understand this and choose wisely.

  • DaMango
    DaMango 2 months ago +1

    I have an RTX 2070 Super and was considering finally upgrading to an RTX 4080, but after watching this...
    Now I'm considering buying an RTX 3080/3080 Ti instead, since they've gone down about 40% in price.
    I want to upgrade because I'm still at 1080p, but I'd like to go 4K Ultra HD in the future.
    Any advice, Jay?

  • Blake Griffin
    Blake Griffin 2 months ago +472

    I love Nvidia but after losing EVGA, I've started to open my eyes on what this company has become and the more I think about it, the more I want to go to AMD. I will wait for November!

    • Rex LR
      Rex LR 2 months ago

      @marty_st3 Did you just say AMD makes good drivers? hahahahahahahahahahahahahahahahahahahahahahahahahaha

    • Ben Wilkinson
      Ben Wilkinson 2 months ago

      I've actually owned more AMD cards than Nvidia cards. AMD screwed me on every card and I kept coming back because of the "value". I'm all for "screw the system" but Nvidia has always just worked for me.

    • LinuxMaster9
      LinuxMaster9 2 months ago

      I recommend Sapphire as your EVGA replacement on the AMD side, assuming EVGA doesn't make AMD GPUs in the future.

    • Hudson Hamman
      Hudson Hamman 2 months ago

      Bruh, it's been this way since before the pandemic.

    • Methamphetameme McMeth
      Methamphetameme McMeth 2 months ago

      @PAPA Smurf Name checks out.

  • Holy Beard
    Holy Beard 2 months ago

    I like that you made a reference to the GTX 970 at the end of the video. It really seems like they're doing something similar to what they did back when the 900 series came out.
    It's a real shame that they're engaging in shady practices like this and turning away potential customers. We all know Nvidia is a corporate business whose main goal is profit, but when you get to the point they're at right now, you have to wonder if they ever gave a damn about the people spending money on their products 🥲

  • BlackPantherFTW
    BlackPantherFTW Month ago

    The MSRP is insane. $1100 for a 4080? A new power supply too?! We as consumers need the competition to step up.

  • Kuro
    Kuro 2 months ago

    For me personally, I'm going to wait for the RTX 40 series to release; hopefully that means RTX 30 series prices will go down in my country, because they sell the RTX 3080 for $1050.
    If the price difference isn't that huge, I'll go with the RTX 4080.

  • TheKFDCase
    TheKFDCase 2 months ago +3

    Keep fighting the good fight, Jay. Once more you show yourself to be a decent man. Bravo.
    I purchased a RTX 3080 10GB in early August; it's a massive upgrade from the venerable GTX 980 work-horse I'd been using for 7 years. There was an obscenely good deal on it (for this region): it ended up costing just ca. 700-720 USD (taking 10+% inflation into account), which is a steal considering it's now selling for 930 USD, and that's with a 'normal' discount! As Jay says, Nvidia makes great products; it's the bean counters that crap all over us from on high.
    I was going to wait for AMD's RDNA 3 offering, and I am very curious as to how it performs. However, I do not have much faith that any of the GPUs in the upcoming generation(s) will be cheap or cheaper. Hence I leaped at the unexpectedly good deal.
    Additionally, if this is the pricing of GPUs going forward, then as I stated during the pandemic, I'm done with high-end PC building. It'll be consoles and possibly more modest/medium-setting PCs going forward. If the game developers do not up the ante, medium settings could prove to be stunning on their own.

  • Benjamin Oechsli
    Benjamin Oechsli 2 months ago +678

    Another difference that isn't mentioned on that chart: memory bus width. You can't just lop 4 GB of VRAM off and call it a day, you have to change the bus, too (something about each bus width having only two memory configs it can handle, and they're multiples of each other).
    So the "4080 12GB" not only has less memory, it has a smaller bus (256-bit down to 192-bit). The performance gap between these two is going to be _much_ bigger than the box would imply. I agree, Jay, the FTC needs to get on this.
    Great video, sir. The rumor mill says AMD is going to have a very competitive product this year. If they don't go nuts with the pricing, they'll be the winners in customers' minds even if they lose the performance crown by a few %.
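    To put rough numbers on the bus-width point above: peak memory bandwidth is just bus width (in bytes) times the per-pin data rate. The GDDR6X data rates below are the announced launch figures and should be treated as assumptions; verify against independent reviews.

```python
# Peak VRAM bandwidth = (bus width in bytes) * (per-pin data rate in Gbps).
# Bus widths are from the announced specs; data rates are assumed launch values.
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

gb16 = bandwidth_gb_s(256, 22.4)  # "4080 16GB": 256-bit bus
gb12 = bandwidth_gb_s(192, 21.0)  # "4080 12GB": 192-bit bus
print(f"16GB: {gb16:.0f} GB/s, 12GB: {gb12:.0f} GB/s ({1 - gb12 / gb16:.0%} less)")
```

    Even before core counts enter the picture, the narrower bus alone costs the 12GB card roughly 30% of peak memory bandwidth — a far bigger gap than sharing the "4080" name would suggest.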

    • Pudding Simon
      Pudding Simon 2 months ago

      Yea I think they had the same stuff with the 3060 that I bought, the 12GB vram version.

    • Hextobyte Protogen
      Hextobyte Protogen 2 months ago

      @Silly Olive You still want to support nvidia and their blatantly questionable actions?

    • Bram Schoenmakers
      Bram Schoenmakers 2 months ago

      @Silly Olive is it inferior though? If bang for buck matters most you probably are going amd.

    • satan
      satan 2 months ago +1

      @Silly Olive ok

    • Silly Olive
      Silly Olive 2 months ago

      I don’t understand why anyone would spend money on an inferior GPU by AMD. Get a job, you bums

  • RN1441
    RN1441 2 months ago

    I very much hope that the AMD chiplet GPUs allow 'bang for buck' to come back to the GPU market. I already had to come down from the '80' series in the 1080 to a 3060ti due to not really being willing to pay more for a GPU than a console costs.

  • souledgar
    souledgar Month ago

    I'd never even considered AMD before the 40 series, and now I feel like I have no choice but to wait and see what their counter-offering will be. I doubt I'm the only one like this. AMD has a golden window of opportunity here to finally take a significant chunk out of Nvidia's market-share lead.

  • KnapfordMaster98
    KnapfordMaster98 2 months ago +10

    Just got an MSI 3080 Ti; very happy with it. I work with 3D rendering in Blender, so I have a lot of incentive to stick with Nvidia cards. I expect this 3080 Ti to suit me for a LONG time; hopefully by the time I need something more, they'll have gotten their act together.

    • Mathias
      Mathias Month ago +1

      @jazzochannel word

    • KnapfordMaster98
      KnapfordMaster98 Month ago +1

      @jazzochannel Agreed, and I will keep that in mind going forward.

    • jazzochannel
      jazzochannel Month ago +1

      My point is: _saying_ that we don't like Nvidia's practices is meaningless if we _buy_ their products anyway.

    • jazzochannel
      jazzochannel Month ago

      ​@KnapfordMaster98 I understand your position, but it's not worth much. Don't take what I said personally. I bought the 2080 ti myself. The most overpriced product I ever bought, and I don't even work with graphics or AI. Nvidia has had criticism thrown their way for many years, the recent stuff is more of the same behavior, but the consumers don't care because there is nothing else on the market.

    • KnapfordMaster98
      KnapfordMaster98 Month ago

      @jazzochannel I actually bought my 3080 before any of this came to light (afaik). What I'm saying is, I like their product but I do not like the direction they're going. And I hope they get their act together so that their product is worth supporting down the line. If not, I will look elsewhere for the next best option, primarily keeping Blender rendering in mind.

  • avja MASH
    avja MASH Month ago

    I was gaming on a 2017 system: i7-7700K, 1080 Ti, 32GB 2400MHz RAM. I was waiting to upgrade, and hearing the rumors about power, heat, issues, pricing, and so on, I decided to jump on the last gen's outgoing pricing and build literally the week before the new Ryzens dropped. I am now on a 5900X and 3080 Ti, and glad I did it. Everything I have is tried and true; no worries, no issues.

  • Whiteshadow
    Whiteshadow 2 months ago +534

    Can I just say this: If a company gets pissed off at your channel for speaking the truth, then that company isn’t worth our money. If nvidia can’t fix their issues, then they don’t really deserve the business that you can potentially give them.

    • That One Toaster
      That One Toaster 2 months ago

      @Bailey The Sleepless I wouldn't know, you're the one licking Jay's feet instead of using your brain.

    • Bailey The Sleepless
      Bailey The Sleepless 2 months ago

      @That One Toaster How's that boot taste?

    • Yellowblanka
      Yellowblanka 2 months ago

      @That One Toaster Lol, when was the last time you went out in the real world? Pro-Tip: it's considered rude to not look somebody in the eyes when you talk to them.

    • Hokhmah
      Hokhmah 2 months ago

      @neoasura First, most 30-series cards were bought by miners and scalpers, not gamers. With mining fairly dead and initial prices so high that scalpers won't sell, only the third group is left to buy 40-series cards. That's why you see the flood of 3000-series cards on the used GPU market. Second, I know you used hyperbole to make your point, but AMD has about 20% market share with RDNA 2 (not 1:1000). They already delivered on the rasterizing part this generation (better FPS per watt and per dollar compared to Ampere), and if it gets even better, then Nvidia can only shine with DLSS and ray tracing.
      At least for me I'm only interested in native 4K/60fps cause it's time to say goodbye to 1080p. Don't want this AI upscaling DLSS or even FSR bullshit. So the only trump card that Nvidia has left is their technological and performance advance in raytracing. I can live without it if I can push everything else to High/Ultra while still maintaining my 4K/60fps. I'm fairly sure there are lots of other customers with the same mindset. So if let's say a RX 7800 (that most likely will again have at least 16GB VRAM with 256-bit memory bus and maybe even better working Infinity Cache) will deliver all that while staying at 250W TDP for around $600-700, then Nvidia can kiss my ass.

    • That One Toaster
      That One Toaster 2 months ago

      @Yellowblanka I'm sorry reality offends you.

  • hueo far
    hueo far Month ago

    You got that right. I remember when the Nvidia 970 was advertised as a 4GB card. It really was 3.5GB. A class-action lawsuit followed.

  • Fater Steve
    Fater Steve Month ago +1

    I bought AMD for my recent build. Nvidia annoyed me to the point that I was willing to live with worse drivers to avoid them.

  • Spy_
    Spy_ 2 months ago

    Waiting till November is always my thought whenever Nvidia releases something...

  • Zappy
    Zappy 2 months ago

    So happy I built my PC 3 months ago. Prices were adjusting, and I could skip this mess.
    The issue, though, is that customers are fickle creatures, and it's been shown time and time again that NVIDIA is right: people will pay, because customers are their own worst enemy.

  • SightedNZ
    SightedNZ 2 months ago +316

    You're dead right Jay. I've been building computers for 20 years now and I've never seen the industry in such a F#%ked state. I just watched Steve's video on the partner cards and they're absolutely ridiculous. The size of the cards, the price of the cards and the superficial/shallow nature of all the marketing is disgusting.
    I'm definitely skipping the 40XX series cards and will run my 1080 Ti until it dies.

    • George Marandianos
      George Marandianos Month ago

      @xXAlmdudlerXx Exactly. Sometimes GPUs last for DECADES! My 8800GTX still works perfectly, silent as a mouse... on a desktop PC with W11 Pro, no less! So when people say "I will run my GPU until it dies", they should keep in mind that this might be much longer than they're expecting...

    • Nabber
      Nabber 2 months ago

      @xXAlmdudlerXx They definitely do, although I've noticed that older cards seem to be more stable and have pretty high longevity. I ran an RX480 from the day it came out up until a few weeks ago when I built a new 3080 system. I only replaced it because I wanted more performance, but the 480 is still fantastic imo.

    • Adam Dunne
      Adam Dunne 2 months ago

      @Rob Over reaction. You can still get very good cards for reasonable prices, just not the brand new ones.

    • Veritas et Aequitas
      Veritas et Aequitas 2 months ago

      This is what communism always brings. Just like the great depression and the dozens of American motor vehicle brands gobbled up, the smaller businesses are being forced to sell to the uberglobalconglomerates who benefit.

    • melvin rosa
      melvin rosa 2 months ago +1

      @Prince Daniel LMAO.

  • jadknew
    jadknew 2 months ago

    I'm in the lucky position where my current hardware is more than sufficient. This 40 series seems so detached from reality, if this is the trend going forward from NVIDIA then it's a major concern. Hopefully AMD or Intel can step up to the plate, if not then 2nd hand market for older cards becomes infinitely more attractive.

  • John Wick
    John Wick 2 months ago

    I'm looking at upgrading soon and I think I'm gonna try out AMD for once. I've always had NVIDIA stuff, but all this shit they are doing is just pushing me away.

  • Arthur
    Arthur 2 months ago +1

    I ordered a 3080 for €720, but after almost a year of waiting it got canceled. I had bought a Reverb G2 at almost the same time as the 3080, and it was collecting dust because I didn't have the card to run it. So I settled for a 3080 Ti for €1200. Way too much money, but not being able to use the Reverb G2 would also have been a waste of money.
    But because of these high prices I will certainly skip two and maybe even three generations of cards.

  • 純ジュン
    純ジュン Month ago

    Not me!! I'll be checking reviews online before I crack open my wallet, and certainly not to purchase inferior hardware. Thank you for bringing this to our attention.
    Is there a chance that hardware is starting to plateau in terms of peak performance? Things cannot continue to improve into infinity unless they discover and market hardware capable of quantum computing.

  • DocBrewskie
    DocBrewskie 2 months ago +717

    Nvidia is so focused on high end and AI that average gamers are left in the dust. If AMD plays it right they could scoop up all of us gamers who aren’t going to spend $800-900 on just a gpu.

    • KeinNiemand
      KeinNiemand 2 months ago

      @Raven.Bloodrot In that case AMD will gain a monopoly and start doing the same shit. You should buy whatever has the best performance per dollar in your budget, regardless of company. Believe me, if AMD had a monopoly on consumer GPUs, they wouldn't be acting much better than Nvidia is now.

    • Luke Walsh
      Luke Walsh 2 months ago +1

      @Rigoberto Ramos Must be new if you think their "budget" cards won't have a massive price increase.

    • Capt T Crunch
      Capt T Crunch 2 months ago

      Not to mention, 2.5 to 3 years from now they will have a 5000-series GPU that they claim is 2-4x faster than the 40-series GPUs for the same price, making your purchase appear pointless. That is why I kept my EVGA GTX 1080 Classified for so many years, and I'm only now getting ready to upgrade. I firmly do not believe in upgrading your GPU/CPU every single generation. Simply buy high-end parts and they will last you multiple generations. You usually can't even see a significant performance increase from one generation's high-end models to the next; it usually takes 2-3 generations to see a huge leap in performance.

    • Capt T Crunch
      Capt T Crunch 2 months ago

      True. Some average gamers spend the price of the 4090 ($1,599.99) on a full PC build alone. Hell, you can probably build a decent mid-range gaming PC for $900.

    • Binoby
      Binoby 2 months ago

      ​@Igor Stoso in the US you can get a 3060 for 400+, an rx 6650 xt is 350 and it performs like a 3060 ti

  • ollj oh
    ollj oh 2 months ago

    Personally, my buy decision is primarily (80%) CUDA core count divided by cost, because ray tracing relies more on CUDA cores than on the noise filtering done by the Tensor cores (whose count is pretty much linear in CUDA core count), and more than on memory size and speed or anything else (which pretty much just scales with target display resolution). Memory transfer is bottlenecked anyway.
    So far, only the "marble demo" needs more than the 10GB of VRAM on the cheaper 3080 to run smoothly (even at small screen resolutions), because it has a LOT of detail and barely any occluders (besides a static camera angle). That's still the only game where I missed the 2GB of extra VRAM of the 12GB 3080 variant, but I chose "more CUDA cores per money" instead, saving money by having less VRAM.
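    The cores-per-dollar heuristic described above can be sketched in a few lines. The core counts and MSRPs below are the announced launch figures and are assumptions for illustration; substitute street prices when actually shopping.

```python
# CUDA cores per dollar for the announced 40-series lineup.
# Core counts and MSRPs are the launch-announcement figures (assumed; verify).
cards = {
    "RTX 4090":      (16384, 1599),
    "RTX 4080 16GB": (9728, 1199),
    "RTX 4080 12GB": (7680, 899),
}
for name, (cores, usd) in cards.items():
    print(f"{name:>14}: {cores / usd:.1f} CUDA cores per dollar")
```

    By this one metric the 4090 actually comes out as the best value of the three, which says more about the 4080 pricing than about the 4090.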

  • teamhex
    teamhex 2 months ago +1

    I'll stay with my 2070 Super. It works fine and I got it for MSRP before the madness began. The fact that even after Ethereum going PoS we're still seeing ridiculous prices... I just can't bring myself to be passionate about this anymore.

  • Tadge- Rico
    Tadge- Rico Month ago

    Love the clarity of the explanation behind costs and of the specs. Clear and concise. Also, I believe everything you have said is correct. No one thing or the other, just degrees. But unfortunately Jay, I don't really see the scenario improving any. With only two players in the market, it is more like a seesaw effect. It just depends on who is trying to "pull the wool" at any given time.

    • Ionstorm
      Ionstorm Month ago

      Just like laptop GPUs: you can have a 3070 with a 100W TDP, for example, and one at 200W, and they perform completely differently (honestly they should be named 3070M 100, etc.).
      People actually think their mobile 3070 will be as fast as a desktop one. I don't know how people expect to tell one 3070 from another without looking at TDPs.
      The whole gaming industry is just turning into an utter scam... this makes the whole RGB era look legit.

  • Zetimenvec
    Zetimenvec 2 months ago

    The last card I purchased was a 1080 (GV-N1080G1) before the crypto baloney skyrocketed prices. It's been suitable since 2016, and while I am eyeing new parts for a potential new build, I'm not particularly interested in Nvidia cards anymore, despite having purchased them pretty uniformly throughout my PC-building career. This sort of behavior makes me even less interested in their cards.

  • David Jones
    David Jones 2 months ago +257

    I've waited 3 years, I can wait until November. Competition keeps people honest, I just hope AMD can put enough pressure on Nvidia to start to right things.

    •  2 months ago

      @s7robe So ass? My 6800 XT runs right along with, and even beats, the 3080 Ti in some instances. And guess what? Not for the 3080 Ti price either, and I have 16GB of VRAM.

    • Nick Currie
      Nick Currie 2 months ago

      Likewise - Im really not a fan of how nvidia have behaved towards gamers in the last few years - my next card will most likely be the new AMD 7000 series

    • s7robe
      s7robe 2 months ago

      AMD GPUs are so ass unfortunately. Idk if them releasing more mid will do anything for competition, which really blows…

    • Jwalker
      Jwalker 2 months ago

      @Grapjas You are right. I found this out last year when my Ryzen 1700X failed and I decided to buy a new CPU + mobo. This was before 12th gen was released. The price difference between a 5600X and an 11400F was 75 AUD (about 50 USD). AMD only released the 5600 and 5700X after 12th gen started getting competitive. And then there are the 5500 and lower CPU models, terrible products that should never have been named as if they're better than the 3000-series CPUs.

    • FoxDog
      FoxDog 2 months ago

      @Aragonshow lol all asians aren't related

  • Ryuofm UofM
    Ryuofm UofM 2 months ago

    Curious to see how these sell in 2023 if the Economy keeps going South and PC buying continues to struggle. My 3080ti will be fine for a while.

  • Unclepungus
    Unclepungus 2 months ago

    I can tell you right now, the way Nvidia is going, I'm holding on to my 2060 until Intel can get their cards working properly and on the market, and then I'm buying Intel. I'm honestly sick of Nvidia charging these astronomical prices and pulling this mischievous bullcrap. That's the reason I'm willing to go Intel: get another party in the mix so Nvidia can finally get a run for their money.


    Here are my thoughts: I prefer to keep my electricity bill low, which means I have a hard cap of 120W on power consumption. Currently I am using a 1050 Ti 4GB that requires no additional power input, and it runs my games at an acceptable level in Full HD. I DID look into the entry-level cards in the 20 and 30 series, but the 2060 uses 160W of power, and the 3050 uses 130W. So I am looking forward to the 4000 series to see if they manage to release a card that is within my power budget. Just to be clear: this 120W was the original power consumption of the 960-1660 series, so I really think the current 50-class cards should have to fit under that. If they do not, then I will buy a card from AMD once my current one loses driver support.

  • Stelios Tsarsitalidis
    Stelios Tsarsitalidis 2 months ago

    Dude, even the 3080 12GB is actually different from the 10GB version. The 12GB has more cores and is based on the same chip as the 3080 Ti and above. They pulled the same trick here, only more extreme. As much as I like many of the technologies Nvidia makes, I hate A LOT of things about them, and this naming issue is the least of them. The problem is that AMD and Intel still have not put up as much competition as they should, so far.

  • Garret Hopwood
    Garret Hopwood 2 months ago +325

    I am one in the camp of "wait til November" to see what AMD brings out. Hoping it's good enough to go full team red.

    • The Rage King
      The Rage King Month ago

      @Kevin B Ok 😄. I wasn't even mad lol, I'm just so used to talking shit to people on here. This is just how I write with this account.

    • Poliack
      Poliack Month ago

      @Kayak Fan "AMD's drivers still bad" - bad for whom? I have been using AMD GPUs for the last 7 years, and only once did I have a driver issue, back when I had an R9 390 in 2015. A regular user will have no problems whatsoever.

    • Kevin B
      Kevin B Month ago

      @The Rage King calm down.

    • Thewaterspirit57
      Thewaterspirit57 Month ago

      @Kayak Fan And I still stand by that to this day. I thought the RX 6400 would be a good card as a temporary thing till I got a "better" Ray tracing card... but nope. Not only are the drivers bad, but also the RT cores basically do nothing good, because of how little there are. 20 cores is like the limit to how many ray tracing cores any GPU should have. Any lower, and you're basically wasting money.

    • Kayak Fan
      Kayak Fan Month ago +1

      @Zombie Slayer As an end-user, maybe. But as a programmer that doesn't really help me unfortunately.

  • FriedFx
    FriedFx 2 months ago

    The latest nvidia announcement has gotten me to start looking at AMD options. That cuda core count for the same 4080 card is just crazy. The 12gb is a 4070. Still using a 1080ti FE card because i refuse to buy overpriced cards. I think nvidia is trying to screw people over who aren't very smart. Not cool.

  • lern2swim
    lern2swim 2 months ago

    The entire approach to naming conventions of gpus has always been intended to obfuscate things.

  • Joshua Bergin
    Joshua Bergin 2 months ago

    Looking at the specifications of Nvidia's new lineup, in terms of its performance, are there any games currently on the market or coming out later that make use of or require that kind of hardware? I currently use a Radeon RX 580, and have been contemplating purchasing a new card. So far I have run everything fine on what I have, but after seeing the gameplay reveal for the upcoming Starfield, I fear I may need a bump. I will say, the cost of these new Nvidia cards is wild, and I can't see myself spending over a grand on something I may not ever make full use of, haha.

    • Daniel Mitra
      Daniel Mitra Month ago

      I think it mostly comes down to resolution these days. There's a plethora of GPUs that may be older but are still fantastic for full HD gameplay. Once you bump up the resolution, however, things get much more expensive. I find the 1060 6GB to still be reasonable because I'm not planning on getting a larger screen - if you are, be on the lookout for recommended cards for specific resolutions.

  • Athalon304
    Athalon304 Month ago

    I waffled back and forth for many years before finally planting roots in the AMD/Radeon camp about 10 years ago. From the "outside" looking in, nVidia has looked more and more sketchy with every year and I can't ever see myself going back. My AMD cards have been more reliable, better value, and they just work for years and years. My work-issued computers have all had current-gen nVidia cards and haven't just been problematic, they seem like robbery when comparing performance and price against my personal AMD card. Maybe this stunt from nVidia can bring people to the Red side and boost competition in the market, who knows? If they stopped stuff like this and lost some of the market, that would be great for all of us in the long run.

  • playascap
    playascap 2 months ago +262

    Jay, you seem burnt out from disappointment after disappointment from the industry, nonetheless I want to thank you for continuing to put out videos with integrity that you believe will help your viewers. Thank you!

    • Jingyang Lin
      Jingyang Lin 2 months ago

      ​@VesperAegis News & Games it was not marginal. Anyway, I am not supporting price gouging. I will just get a cheap laptop to play cheap games. I am willing to wait as much as 20 years, until this price-gouging nonsense is gone, before I pick DIY PCs back up as a hobby.
      Good luck to NV once the Fed hikes rates to 5% and everyone goes into a recession.

    • VesperAegis News & Games
      VesperAegis News & Games 2 months ago

      @Jingyang Lin The 30-series was a marginal move above the 20-series in terms of performance in comparison. The only modern comparison we have in terms of how big this upgrade will be, assuming the specs are accurate, was the 10-series. Predictably the crowd also complained to the high heavens when the 30-series was released that it was "too soon" after the 20-series Ti and that the price wasn't justified due to performance, even though that was a much smaller leap according to the numbers. There's a huge power draw on this one in particular for a good reason. $1600 is 100% reasonable at present given inflation and performance combined.

    • Jingyang Lin
      Jingyang Lin 2 months ago

      @VesperAegis News & Games what rubbish, performance has been doubling every generation, and RTX 40 is no exception, please don't make it sound like one.
      The 4090 only deserves a price of $999, nothing more. The 4080 12GB should have been $399.

    • Girard
      Girard 2 months ago

      It must be so sad for him to get all that hardware for free.

    • VesperAegis News & Games
      VesperAegis News & Games 2 months ago +3

      @BigNiggaBalls The same could be said about how many people suddenly kiss the ring of AMD the very second a certain price point is reached, regardless of performance. Typically you can only be an obsequious fanboy if another product actually is better and you remain a fanboy anyway.

  • Chris Liddiard
    Chris Liddiard Month ago

    3 weeks later and Nvidia has 'unlaunched' the 4080 12GB - you called it. But it's still a win for Nvidia, as those gamers who were prepared to shell out for the cheaper yet overpriced card will now go up a tier to the 4080 16GB and spend more doing so.

  • JT Money
    JT Money Month ago

    Great job with the video. I've been an Nvidia fan for over a decade now, but when the spokesperson (couldn't care less about his name) said point blank that the prices were what they were, were not going down, and customers were just going to have to suck it up and get over it... that was a very bad look for Nvidia. F*ck that guy, f*ck that company. That's another example of how, as you said, Nvidia has failed to be customer-centric: people are not made of money. If they can't figure out how to make products that A) people will want to buy, and B) people can afford to buy... well, let's just say I don't think I speak only for myself when I say my money speaks for me, because I'm not going to support a company that cares so blatantly much about profit margin and so absolutely little about its customers.

  • itwashear
    itwashear 2 months ago

    I have only used Nvidia cards since forever (~2005), and I have been less and less impressed by all the crap they have pulled in the last few years. As of a few months ago I had already decided to avoid them if at all possible. At this point I don't really care if I get slightly less power per dollar; until they do a 180, I'm sticking with AMD. I already put my first AMD processor in my build 2 years ago, might as well go all the way on the next one.

  • Jackson J
    Jackson J Month ago +1

    I've been using EVGA cards in every PC I've made since my first build. Guess I'm switching to team red, because nvidia is pissing off everyone and they're really shafting their customers at every turn.

  • Benjamin Wan
    Benjamin Wan 2 months ago +91

    Which means it went from 499 USD for the 3070 to 899 USD for the "4070", and the 3080 went from 699 USD to 1199 USD for the 4080. Amazing inflation in prices. Stunning. Let's hope team Red releases more affordable cards.
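    Taking the MSRPs quoted in this comment at face value (they are the commenter's figures, not verified launch prices), the jumps work out to roughly 80% and 72% respectively - a quick sketch:

    ```python
    # Generation-over-generation price jumps, using the MSRPs quoted above
    # (commenter's figures, not verified against official launch pricing).
    def pct_increase(old: float, new: float) -> float:
        """Percentage increase from old price to new price."""
        return (new - old) / old * 100

    jumps = {
        "3070 -> '4070' (4080 12GB)": (499, 899),
        "3080 -> 4080 16GB": (699, 1199),
    }
    for label, (old, new) in jumps.items():
        print(f"{label}: +{pct_increase(old, new):.0f}%")
    # 3070 -> '4070' (4080 12GB): +80%
    # 3080 -> 4080 16GB: +72%
    ```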

    • Raving Crab
      Raving Crab 2 months ago

      @Hjelpen crypto miners are not humans

    • Robot Spartan
      Robot Spartan 2 months ago +4

      @banescar bad counter, as performance is ALWAYS higher generation to generation.
      Now, if the performance increase is double what it usually is (a 3060 = 2070 = 1080 roughly, so in this case a 4060 = 3080), then it's more of a name shift, which is a different conversation altogether. But that remains to be seen.
      Short version: if a 4060 = 3070, it's greed from Nvidia. If a 4060 = 3080, it's a badly implemented shift in SKU naming.

    • Fiona Sherleen
      Fiona Sherleen 2 months ago

      @Hjelpen there were miners

    • banescar
      banescar 2 months ago

      Not exactly, because performance is also higher.

    • Hjelpen
      Hjelpen 2 months ago +4

      @scproinc exactly. All that happened is Nvidia understood from the last few years that people are willing to pay the enormous scalper prices for GPUs, so why not raise them themselves for maximum profit when the cards sell out instantly anyway.

  • Fernando Cordero
    Fernando Cordero Month ago

    Well said. Glad you are speaking up and letting consumers know the truth.

  • megapet777
    megapet777 Month ago

    I have a 3070 and I'm happy with it. I will be using it for a long time. I just don't like what Nvidia's management is doing, even though their product is great. Hopefully in 2-3 years AMD or Intel will beat Nvidia's ass. They kinda deserve it.

  • Daniel Vitorino
    Daniel Vitorino Month ago

    Fully agree on the naming thing. I mean, like you explained, there is even a plausible reason for prices going up, but why be shady with the 4080 12GB? That's just... yeah, it's just a scheme to have people pay the xx80 price for an xx70 card, and anyone with only a basic understanding of how the cards line up (or are supposed to line up) name/performance-wise may very well fall for it.

  • Cort Phelps
    Cort Phelps Month ago

    Tell it like it is, man! I hope AMD sticks it to Nvidia. I'm a believer that good things come to those who wait. My RTX 2080 is still a great card, and if AMD can give me more bang for the buck, I'm gonna jump the Nvidia ship.

  • Karsten Rechnagel
    Karsten Rechnagel 2 months ago +688

    An even bigger difference between the 12GB and 16GB versions of the 4080, besides memory size and core count, is the memory bandwidth: only a 192-bit-wide bus vs 256-bit, and even 21Gbps modules vs 23Gbps modules. So sad they call the 12GB version a 4080 :(

    • kimnice
      kimnice 2 months ago

      @David Olszak
      On those specs that you mentioned: the 4080 cards have 8 or 10 times more L2 cache than the 3090 Ti. Shouldn't that mean the 4080 GPU has to make fewer calls to VRAM and thus needs less memory bandwidth?
      I mean, in the past this is exactly what happened when Nvidia released Maxwell: the GTX 760 had over 70% more memory bandwidth than the GTX 960 but still lost, and the GTX 780 Ti had over 30% more bandwidth than the GTX 980 Ti but still lost.
      There are plenty of valid reasons to hate this launch, but this isn't one of them.

    • Sviatoslav Damaschin
      Sviatoslav Damaschin 2 months ago

      That was to be expected: the 12GB is missing 2 RAM chips, and each chip has a 32-bit bus to it, so 256 - 192 = 64 bits.
      The bus is garbage per se because they went with high-capacity cheapo 2GB modules and still increased the price :)
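      Using the bus widths and per-pin speeds quoted in this thread (taken at face value from the comments, not checked against Nvidia's spec sheets), the resulting bandwidth gap can be worked out directly:

      ```python
      # Peak memory bandwidth from bus width and per-pin data rate.
      # GB/s = (bus width in bits / 8 bits per byte) * Gbps per pin.
      def bandwidth_gb_s(bus_bits: int, gbps_per_pin: float) -> float:
          return bus_bits / 8 * gbps_per_pin

      print(bandwidth_gb_s(256, 23))  # 4080 16GB: 736.0 GB/s
      print(bandwidth_gb_s(192, 21))  # 4080 12GB: 504.0 GB/s
      ```

      So on these figures the 12GB card has only about two-thirds of the 16GB card's memory bandwidth, which is why the bus difference matters beyond the capacity itself.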

    • Venox
      Venox 2 months ago

      @David Olszak Well, you'd be surprised how many rich people just buy the top-of-the-line stuff without even paying attention to what it's called, let alone reading the specs.

    •  Not AN AMD Nor Nvidia nor Intel fan
      Not AN AMD Nor Nvidia nor Intel fan 2 months ago

      because it's not a 12GB 4080; we all know the 4080 12GB is really the 4070 12GB

    • Andre Messado
      Andre Messado 2 months ago

      @David Olszak I think there are enough rich people who don't know what they're doing that it'll even out, and/or they'll actually end up making as much as they were expecting to.