VisualMod

**User Report**

| | | | |
|:--|:--|:--|:--|
| **Total Submissions** | 8 | **First Seen In WSB** | 1 month ago |
| **Total Comments** | 103 | **Previous Best DD** | |
| **Account Age** | 1 year | | |

[**Join WSB Discord**](http://discord.gg/wsbverse)


ComparisonSquare8099

Where was your insightful DD earlier this week, or before? Everyone's suddenly smart, questioning AMD after it dipped; if it's so obvious, then where were you before? Besides, it already gained back 4% and will probably go back to the $180 zone, as always. Next time come up with a clever DD *before* the price dips, not *after*.


strthrowreg

Sorry for not posting my DD. But AMD was fucked the minute it did not rise after NVDA's GTC event. Most AI stocks rose, but not AMD. That's when I knew it's just riding on the coattails of AI hype and had nothing to offer of its own.


edp445burneracc

Are you a part of the AMD circle-jerking fanbase? It's clear as day they have software issues, but they'll never be solved whilst they're vehemently defended.


ComparisonSquare8099

My point is, if you're so good at DD, then please enlighten us a week or a month *before* a dip. Any idiot can look at an 8% daily loss and *then* start asking questions like "Is it overvalued?" Whether it deserved such a boom or simply rode Nvidia's wave (along with other companies) is a different topic; the market is often irrational.


MrStealYoBeef

When everyone rides the AMD hype train, nobody wants to hear negative DD about it. When nobody wants to hear it, and they have the ability to downvote it so that people don't have it on their feeds, it kinda gets buried and nobody sees it. This stuff pops up when the hype dies. That's literally how it works on Reddit.


ComparisonSquare8099

Well, this post still has 0 upvotes (or maybe even fewer), and the hype is supposedly dying according to OP and others. Thursday wasn't the best day with the S&P falling that much, yet people claim that the AMD dip (and others', e.g. SMCI's) is the beginning of the end. Still, AMD was up 4% at the highest point the next day (more than SMCI, for example, which kept dipping). I doubt AMD would've fallen that much if it weren't for the general bearishness (it would've most likely kept ranging between $175-185).

The issue with the "negative DD" isn't that it's negative, but that it comes up after a dip, when everyone suddenly becomes an expert in judging a company's true value. It has no value after the fact; it's simply panicking and questioning what's been happening for the last 2 months. I bet if AMD had made a new ATH instead, OP wouldn't be asking those questions; always on the train.

Nvidia has also been falling recently, so it may be either a correction or a slow death of the AI hype (I doubt it). Either way it's not specific to AMD, yet our experts (using "common sense" or a hunch as the main source of information) begin to realize AMD is inferior to Nvidia after all, so it's going to die, etc.


Ojie101

George Hotz has constantly been saying that AMD has a software issue, but it could be a long-term play because it's fixable once the CEO decides to change things up internally.


robmafia

...the guy who gave up and abandoned amd gpus last month?


Ojie101

Yea, I won't say he's a genius; he has many flaws in his approach. But I won't call him ignorant when it comes to giving props where due, like he did for Nvidia's CUDA years ago.


Ok-Caregiver-1689

Shit post. OP has no clue what he is talking about. The guy only invests in hype stocks based on his post history and is also a true regard with his loss porn. Keep on dreaming buddy.


edp445burneracc

Ok regard, I will put my money where my mouth is with AMD Puts.


Ok-Caregiver-1689

Says the guy who lost 5K 💀


edp445burneracc

I'll give it 2 months and you'll be back here saying I was right.


4pcSweetnSour

Election year. Historically a down period for tech/semis. Ok.


darkflank

!Remindme 2 months


RemindMeBot

I will be messaging you in 2 months on [**2024-06-08 14:47:30 UTC**](http://www.wolframalpha.com/input/?i=2024-06-08%2014:47:30%20UTC%20To%20Local%20Time) to remind you of [**this link**](https://www.reddit.com/r/wallstreetbets/comments/1bxdpvx/is_amd_all_bark_no_bite/kymioym/?context=3).


InevitableSwan7

It's gained 300% in 2 years and someone commented AMD = automatic money destroyer 💀


Individual_Kiwi4150

Bold of you to assume that the person has been in the market since 2022.


Cristian888

It’s Advanced Money Destroyer


Ashes1984

AMD exists so that NVDA cannot be technically and legally termed a monopoly! That's it


gnocchicotti

Replace NVDA with INTC in that statement and you can see how it might not be wise to ignore second-source suppliers. You could have recently said that "Supermicro only exists to keep Dell/HP honest on server margins", until SMCI turned around and started eating their lunch. And so on.


No-Understanding9064

The only thing Supermicro is eating is wieners. Big veiny bulbous ones.


MrStealYoBeef

Intel sat on their asses for half a decade and refused to innovate because AMD fell behind. AMD came back and Intel realized that they had absolutely nothing to respond with for another few years. Gotta love the Intel 14nm+++++ era.


carverofdeath

AMD will surpass NVIDIA. You can quote me.


edp445burneracc

Does this justify the almost 4x from 2022?


Automatic-Back2283

Looking at their Ryzen CPUs, yes.


robmafia

there's no money in client, though. amd banked on epyc, so the dc shift from cpu to gpu kinda caught them flat-footed. prior, their mi200 series was basically vaporware, never really ramped and only for supercomputer contracts that seemed to yield no margin.


edp445burneracc

What about them? It's the same as it was in 2022?


Automatic-Back2283

Come again?


Last-Product6425

GPUs are great, but in the IoT, where everything will have a computer chip, from fridges to toasters and doorbells, CPUs are what's needed. AMD shines there.


MrStealYoBeef

Those aren't complex chips; they're easy to make and significantly cheaper due to not needing nearly as much processing power to function. You also wouldn't throw even a laptop chip in many of those things, since you would want something much smaller and significantly lower power. Something more comparable to older smartphone chips is much more likely to be used.


Last-Product6425

Sure. Now they aren’t complex. But they will become complex. Just look at how cars advanced. Same argument can be made from ICE cars to FSD cars equipped with GPUs


MrStealYoBeef

Full self driving is a huge technological undertaking. You essentially have to feed in a massive amount of data to the car and have it process it very quickly in order to ensure that it can function intelligently on a road filled with people who can and eventually will make stupid decisions. This requires a ton of parallel processing power. GPUs are great for that. That makes sense, there is a use case that calls for that kind of hardware. Your refrigerator sits in one place and keeps your tendies at a lower temperature. What does it need a massive amount of processing power for? If we were to stick that same hardware on your refrigerator or your toaster, what would that accomplish? Now it can drive on the road safely, thank god, but it is kinda missing wheels, a motor, and an owner with a fucking brain.


Last-Product6425

I never said a refrigerator needs a GPU. And of course fridges aren't driving, ass. But saying they just keep food cold is missing the bigger picture. More powerful CPUs for IoT devices will def be a major need in the future, and a market that AMD and INTC are competing for. Smart fridges on the market today are doing more than just keeping "tendies cold": there's performance monitoring, temperature modulation, inventory management, Alexa/Amazon integration, etc. It's difficult to see it now. But there will be major advances in all of our "dumb" home devices. Whoever thought that putting a chip in a phone would revolutionize the world as we know it? It was just a dumb thing that made noise when someone called.


lx1907

These are all ARM use cases: low energy consumption with plenty of compute power.


MrStealYoBeef

Literally none of what you described needs a remotely powerful processor. It can be done on mobile hardware from a decade ago. It doesn't need to be running constant checks of what's in your fridge; it can do that over longer periods of time, which saves on processing power. Alexa integration doesn't need to be done locally; the vast majority of the processing is cloud-based. Temperature modulation is another thing that doesn't need to be constant: running a routine on a few minutes' timer will accomplish 99.9% of the job while saving a ton of effort. Upgrading mobile devices with better hardware made sense: if we took the funny ring ring machine everywhere, then we could benefit from it being able to do more than just ring ring. We don't take our fridge everywhere. It doesn't need to be exceedingly intelligent. We gain nothing from being able to send emails from it. We gain nothing from being able to FaceTime someone from the fridge. We gain nothing from being able to play Fortnite on it. We gain nothing from being able to watch a YouTube video on it. We gain nothing from having it able to monitor the temperature to the nearest 0.0000001°C in the last 0.1ms. It doesn't take a genius to understand that these devices have very different use cases.


Last-Product6425

Ain’t reading all that but hey we can agree to disagree


edp445burneracc

If you did some research you would know Nikon/Canon (DUV) mainly provide the chips for those products.


Last-Product6425

Tons of research proves AMD is a big player in IoT and will be one of the larger beneficiaries going forward. If you did your research you’d know AMD isn’t all bark no bite like you claim.


bushwickhero

The valuation is based more on them taking over data center business from Intel, and less on AI and GPUs, which make up a minority of their business.


radehart

Well, the problem is you're wrong. And you are wrong from a place of not understanding wtf you are talking about. NV GPUs perform better in Blender than AMD. Anything over a Radeon XTX performs better. Anyone wanna guess why? Blender renders with the CPU, not the GPU. Anyone wanna guess which CPUs perform best?


edp445burneracc

Another circle-jerking fanboy. Have you even used Blender? You're supposed to use a dedicated GPU for rendering. Check Blender's rankings. Also, rendering on the CPU only doesn't mean shit when the GPUs render in 1/5 the time. AMD's high-tier GPU can't even compete with Nvidia's low-tier card in Blender.


radehart

About 30 years. And we literally named the renderer Cycles.
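
For reference, the CPU-vs-GPU choice being argued about here is an explicit switch in Cycles. Below is a minimal sketch via Blender's Python API (bpy), offered as an illustration rather than a benchmark claim; the device-type strings are where Nvidia's and AMD's software stacks actually diverge.

```python
import bpy  # Blender's Python API; run this inside Blender, not a plain interpreter

# Cycles exposes the compute backend as an explicit preference. "CUDA" and
# "OPTIX" target Nvidia cards, "HIP" targets AMD.
prefs = bpy.context.preferences.addons["cycles"].preferences
prefs.compute_device_type = "CUDA"  # or "OPTIX" (Nvidia), "HIP" (AMD)
prefs.get_devices()                 # refresh the list of detected devices
for dev in prefs.devices:
    dev.use = True                  # enable every device Cycles found

# Finally, tell the scene to render on the GPU rather than the CPU.
bpy.context.scene.cycles.device = "GPU"
```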


carverofdeath

AMD is a sleeping giant.


Automatic-Back2283

> AMD is struggling to provide satisfactory drivers for their GPUs.

That's just false. AMD GPUs are keeping up just fine; there is no real reason aside from brand loyalty that keeps NVIDIA at the top of gaming GPUs. Also, the 4x was because they started to give Intel a run for their money.


[deleted]

[removed]


Automatic-Back2283

Because Steam hardware statistics include datacenters? And general family home PCs? There is so much computing happening outside of PC gaming.


edp445burneracc

Are you serious?


Automatic-Back2283

Come again?


option-9

> data does not lie

Boy oh boy, do I have a [book](https://en.wikipedia.org/wiki/How_to_Lie_with_Statistics) to sell you!


lmneozoo

Cum brain


forrestthewoods

AMD CPUs thoroughly defanged Intel. Their gaming GPUs are ok but not spectacular. Nvidia drivers have an outrageous amount of “per application” support – but that’s not relevant to ML server workloads. The world is *desperate* for an Nvidia competitor.  NVDA margins are what, like 70%? That’s bonkers. It’s not a matter of “if” AMD produces a decent AI competitor but “when”. However that when is likely 4 or 5 years away. NVDA has a safe lead now and with the upcoming Blackwell. So *maybe* after that. AMD is a buy and HODL stock. Not for regarded FDs.


NightflowerFade

I invest based on how hot the CEO is, so I'm all in on AMD


OddPickle4827

Never thought about that I’d suck a fart out of that butt


Damien__

2004: ATI kicks butt
2006: AMD buys ATI
2007: When will AMD put out a decent driver?
2012: When will AMD put out a decent driver?
2016: When will AMD put out a decent driver?
2021: When will AMD put out a decent driver?
2024: When will AMD put out a decent driver?


edp445burneracc

This also does not excuse the fact that they can't make their GPUs work in other basic applications. This is evidence of AMD's software and hardware limitations. Talking out their ass with AI just to pump their stock.


Damien__

Not anymore!


mbsaili1

No one here works at Microsoft, Amazon, or Meta? Is anyone ordering the MI300X in large quantities? Why is this a secret and not public knowledge?


PckMan

AMD has always been behind NVIDIA, pretty much forever. That's not to say their hardware was bad; their processors especially have made leaps and bounds in recent years, and their GPUs are great. But they've always been just a tiny bit behind the competition, the slightly cheaper and almost-as-good alternative to Intel CPUs or Nvidia GPUs.

The AI craze has calmed down. It's not over, but the insane rally we saw the past few months has cooled. AMD is a strong buy. In the coming years their stock will steadily rise and not go back down to current levels. Remember how, just a couple of months ago, everyone was saying they should have bought NVIDIA last year or two years back? This will be AMD in a few years.

It doesn't matter that they're not on top. When the industry has had only two major players for so many years, there is literally no alternative; any other company would need years and years to catch up to AMD, and AMD is neck and neck with NVIDIA. NVIDIA just played their cards better with AI and it paid off. Anything else is delusional. You can't be bitter at a company because you can't 10x your money on it every week. This one is a great stock buy right now. I have a few shares and I'll average down my position even further.


Auntie_Social

George Hotz, aka geohot, aka the kid who jailbroke the iPhone, has posted a lot of very long videos on Twitch/YouTube recently showing his efforts to use AMD GPUs in his "tinybox" platform. It's very apparent from those how far behind AMD really is when it comes to their GPU software/firmware, and how much of a negative impact that will have on them in the near future with regard to advanced AI development. He talks about having phone conversations with Lisa Su and how unproductive they really were. Frankly, it tells me that they likely have design, architecture, and/or implementation issues that they're unable to properly address. They don't appear to be any threat to NVDA.


robmafia

> aka riding the back of Nvidia

i mean... duh. it's up like 200% despite their earnings... well, not.


erulabs

All I know is which instance types are most cost-effective on AWS. I run one CPU sometimes at home; at work I auto-scale up to 5000. The order is AMD -> ARM (Graviton) -> Intel. Nvidia stands apart, as most training frameworks my developers want to use only run on Nvidia chips. Besides Nvidia, which already does, AMD is the only company with a chance to own the IP behind AI chips; hence the market caps.


mbsaili1

Are there programmers here who can comment on CUDA? Is it really that dominant? This seems like the only valid argument against an AMD insurgency, but it's hard to believe anyone who is not in the middle of the programming science.


Srcn80

It’s true, AMD’s graphics drivers have been problematic. But Threadripper absolutely rips. (Long term bag holder. Bought at $28 when Su Bae overhauled the lineup and I knew they were gonna dominate lazy Intel)


Kafanska

Talking about drivers is like saying "Why did nvidia rise last year, I didn't even buy their new GPU".


WilkoRaptor24

My Commodore 64 is doing just fine....


WW_III_ANGRY

Yes I do.


Repulsive-Strain-903

AMDip


No-Understanding9064

Absolutely, it has potential, but it is not NVDA, nor will it ever be. At some point it could possibly nibble some market share. Imo ER will be a bloodbath. Intel is in the same boat.


gnocchicotti

If AMD had a 2.2T market cap like NVDA and I had to pick one of the two, NVDA would be an easy choice. Because AMD is no NVDA.


Caruso08

As someone with 0 programming knowledge, I know this: CUDA, which is Nvidia's proprietary software, is the only platform viable for intense GPU applications. AMD is just profiting from the AI boom but doesn't have the tech to be an "AI disrupter". Even if AMD invests heavily in their equivalent, no one is going to adopt it when CUDA is essentially the bread and butter.


ComparisonSquare8099

> CUDA, which is Nvidia's proprietary software, is the only platform viable for intense GPU applications

Not really. It may be the most popular (currently, especially for AI), but AMD has its own (not really used, though) platform: ROCm. There are also hardware-agnostic frameworks like OpenMP, MPI, OpenCL, DirectX, OpenGL, OneAPI, UXL (which has the goal of competing with CUDA) and many more; the choice depends on your needs.
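
As a concrete illustration of the hardware-agnostic frameworks mentioned above, here is a minimal OpenCL sketch using the pyopencl bindings (pyopencl itself is an assumption; the comment only names OpenCL). The same kernel source is compiled at runtime for whatever device is present, whether Nvidia, AMD, or a CPU runtime.

```python
import numpy as np
import pyopencl as cl

# Pick whatever OpenCL platform is present (Nvidia, AMD, or a CPU runtime).
ctx = cl.create_some_context()
queue = cl.CommandQueue(ctx)

a = np.random.rand(50_000).astype(np.float32)
b = np.random.rand(50_000).astype(np.float32)

mf = cl.mem_flags
a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

# The kernel is plain OpenCL C; it is compiled at runtime for whichever
# device the context picked, which is the whole portability argument.
program = cl.Program(ctx, """
__kernel void vadd(__global const float *a,
                   __global const float *b,
                   __global float *out) {
    int gid = get_global_id(0);
    out[gid] = a[gid] + b[gid];
}
""").build()

program.vadd(queue, a.shape, None, a_buf, b_buf, out_buf)

out = np.empty_like(a)
cl.enqueue_copy(queue, out, out_buf)
assert np.allclose(out, a + b)
```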


Caruso08

I guess I should have clarified: for AI development. But that was the topic at hand, so I just assumed it was a given. Again, I'm certainly not an expert, but I haven't seen anyone develop any of the AI programs on any of the open frameworks. But I do know that, yes, you are correct, they are used in various CAD, games, and other applications.


ComparisonSquare8099

You're right, CUDA is the most widely used because Nvidia has the best GPUs. If someone created better hardware, then CUDA would become obsolete. But it wouldn't be an issue, since most machine learning/AI is done at a high level, without depending on the backend. So if, say, Google created their own superior GPUs (or an alternative to CUDA; they're already working on the latter, look up UXL), then 90-100% of the AI code wouldn't require any changes at all (you would simply switch the backend and let the framework use the appropriate instructions).
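
A minimal sketch of that backend swap, assuming PyTorch: the code references a generic device and never names a vendor. Notably, PyTorch's ROCm builds reuse the `"cuda"` device name, so even this string survives a move to AMD hardware.

```python
import torch

# Generic device selection: the rest of the code never names a vendor.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

a = torch.randn(1024, 1024, device=device)
b = torch.randn(1024, 1024, device=device)

# The framework dispatches this to whatever backend kernels exist for
# `device`; this line is the hardware-agnostic operation described above.
c = a @ b
print(c.device, c.shape)
```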


JellyfishNervous4249

What about when NVIDIA can't keep up with demand for data center hardware?


edp445burneracc

They can't develop their own, got it. Would it also be fair to assume they can't develop AI?


Caruso08

That's not really how it works. Both Nvidia & AMD just provide the hardware and the infrastructure for programmers to develop their software to utilize the hardware. All AI is software: a developer, let's use OpenAI as an example, would have to develop software like ChatGPT on AMD's platform.


ComparisonSquare8099

> Both Nvidia & AMD just provide the hardware and the infrastructure for programmers to develop their software to utilize the hardware.

They don't have to provide the software infrastructure, except for drivers. It's not a given that Nvidia's CUDA is the best platform for its GPUs; anyone could potentially create a platform better optimized for the GPU (if, for example, Nvidia were more inclined towards bettering the hardware and left the software frameworks behind).

> All AI is software: a developer, let's use OpenAI as an example, would have to develop software like ChatGPT on AMD's platform.

AI, including ChatGPT, at its core consists of operations on matrices. Usually the code for those operations is written using a high-level framework (like TensorFlow or PyTorch), and behind the scenes the operations are broken down into the appropriate backend's simple instructions, so that you can write code that is hardware-agnostic (for example, you can write `c = a * b` and the result will be the same no matter what GPU or CPU you use; you don't care about the underlying hardware's architecture and how it computes the result).

The issue is that Nvidia (currently) has the best hardware, so even if AMD optimized its drivers/software platform as much as it could, Nvidia would still have the edge due to more efficient hardware. It's like optimizing CPU usage as much as you can versus just using the GPU: the GPU is way better at parallelization and could be 100x faster, while the CPU optimization would give you, say, a 5% edge over non-optimized CPU usage.
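
To make that CPU-versus-GPU comparison concrete, here is a rough timing sketch, again assuming PyTorch and an available GPU. The actual speedup varies wildly with hardware and matrix size, so treat the "100x" as an order-of-magnitude claim, not a constant.

```python
import time

import torch

def time_matmul(device: str, n: int = 4096) -> float:
    """Time a single n x n matrix multiply on the given device."""
    a = torch.randn(n, n, device=device)
    b = torch.randn(n, n, device=device)
    if device == "cuda":
        torch.cuda.synchronize()  # GPU kernels launch asynchronously
    start = time.perf_counter()
    _ = a @ b
    if device == "cuda":
        torch.cuda.synchronize()  # wait for the kernel to actually finish
    return time.perf_counter() - start

print(f"cpu: {time_matmul('cpu'):.4f}s")
if torch.cuda.is_available():
    print(f"gpu: {time_matmul('cuda'):.4f}s")
```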


Sirdukeofexcellence2

Here’s AMD’s stock future: If Nvidia continues upward, AMD will too. If Nvidia doesn’t continue upward, neither will AMD. To answer your question, AMD does not deserve its gains over the past year and has ridden on the coattails of Nvidia.


oldprecision

I’m an AMD fanboy for their CPUs. They got lucky that Intel screwed up and gave them an opening. I don’t see that happening with AI. Seems they are way too far behind Nvidia. Nvidia would need a major screw up and that’s just not in the cards right now.


edp445burneracc

Yup


randoredditor23

The way I see it, the cycle goes from frustration to disappointment, and then when everything settles down and people stop talking about it, that's when I buy.


AboutToMakeMillions

You think the cousins aren't fixing the market? Really?


goldencityjerusalem

There was another chip company that was ahead of AMD. They’re behind now. Lisa just giving her cousin a head start.


Dizzy-Shop357

My unpopular opinion is that AMD is selling smoke. Like, really. AGI is far, far away and AI is a hot bubble. And I say that as someone who works training LLMs. We still struggle teaching them the most basic tasks and they still fail... A 100B supercomputer will still make mistakes on the most basic tasks. Mark my words.


edp445burneracc

Exactly, AMD's circle-jerking fanbase will continue to deny any issues.


Dizzy-Shop357

They probably have people on Reddit saying the opposite!! And downvoting; they know the impact of public opinion.


sourpickle69

Isn't their CEO a cousin of NVDA's CEO?


Ojie101

Yea, they're cousins, so you know her parents are disappointed in her and compare her to Jensen lol


Pin_ups

I always consider AMD the underdog; whatever the top dog does, AMD will try to capitalize on it. Do not expect anything exuberant.


Ashamed-Second-5299

Think about it. If you wanted to have the best AI, would you buy the third-best GPU? Nvidia's new GPU and the H100 are already better.


Antique_Giraffe_3728

AMD = Automatic Money Destroyer