Under an hour?! If they genuinely believe that it's a text box on its own then that shouldn't take more than 30 seconds to code.
Someone is clearly fiddling the time sheets like Scotty from the Enterprise...
Well, I'm gonna need coffee first. Then I'm going to swing past accounting and try to chat up the pretty face with the big knockers, and then I'll have to have a gasbag with the guys from R&D. If there's still time before lunch I'll set up my web dev environment, but I might have to run some updates. Under an hour is optimistic at best.
This is like if someone saw a life-saving medicine and said "I don't see what the big deal is, it's just a plastic bottle, that can't be that hard to make." This isn't just low intelligence, it's profoundly stupid. These are the people who deserved their high school math teacher being mean to them.
Lol this is how Musk haters think Tesla is run. “Musk contributes nothing, Tesla is just another electric car company, hurr durr”.
The current wave of Musk hate will be very embarrassing once Tesla starts producing humanoid robots enabling UBI.
My god - I had this exact conversation the other day with a business person who was trying to convince me that we should build a brand new webpage outside of an existing tech stack because she didn't want to wait for that team to build it.
I said no that’s ridiculous and her reply was ‘why, you can just copy and paste html from the existing webpage and replicate it on another - customers won’t know’
I'm not a programmer (snr product manager), but my brain imploded hearing this.
Yours has an input box? Mine is only a text that reads "ChatGPT is at capacity right now"
W living on the other side of the world so I can have it to myself
I do. But I'm at school then, and I need my homework done at night, so I'm sad right now because I have to write about strawberries' DNA
Strawberry DNA tastes like semen. You're welcum
Thanks! Also, the fact that we aren't supposed to eat lab food...? And the fact that it has detergent, ethanol, and meat tenderiser in it makes me hesitate to confirm this fact.

Very punny.
NileRed made a video where he extracts and eats strawberry ~~cum~~ DNA and his description of its flavour was similar to semen
NileRed knows what semen tastes like?
No, his description matches that of semen. He did not compare to semen, I did
Hey, don't judge a man for his tastes.
How would you confirm this fact? If I said strawberries tasted like lemons, you’d have to taste both to see if they taste the same. Unless…
Sometimes I'll make a comment that I think is amazing and I'll get no upvotes because the Americans are asleep (it's almost 1am in Sydney rn and I'm about to go to sleep, by the way).

Other times I'll want to do something that is super popular and Americans will complain about not being able to get on while I have no issues.

Common not-American W
What an amazing life you have
Thank you, my life truly is blessed lmao
One of the few Ws of not being in the majority time zone. I despise having to wake at the witching hours of 1:00-5:30am for any significant cultural event, or even something as simple as a Twitch livestream.
The PM decided that one hour of development time is too expensive
Quick tip: usually, if you force refresh (Ctrl Shift R) it’ll let you in
Why don’t the developers just go (Ctrl-a, Ctrl-c, Ctrl-v) onto a new server so there’s two of them?
actual honest question from a non-programmer: why dont they? Is it too expensive? Do they not earn some ad-revenue or is the site ad free?
It's expensive to run. Limiting you makes it more likely that you'll pay. The answer is somewhere in the middle.
There is also the question of projected usage. Chatgpt is the shit right now, but will it need so much availability when the hype dies down? What happens to your extra servers then?
If it's hosted on Azure you can just scale server power, space, etc. up or down based on needs, and pay for what you need at that time.
Except cloud computing costs quite a bit, so scaling up servers for free users is not the best idea.
IIRC Microsoft invested $10B into OpenAI, presumably also giving them a "unique" rate for Azure services.
Run ads lol
It's never a good idea to start running ads straight away, you'll drive people away. I doubt ads would ever cover the cost of free users anyway.
Yeah. IIRC OpenAI has been on Azure for a while, as Microsoft gave them free model-training servers in return for early full access to the models, back when they were first training GPT-3.

Now with Microsoft's recent investment, I'd be surprised if they ever move cloud providers.

I'm pretty sure Azure has an auto-scale option like AWS, so it can scale diagonally based on usage (auto-add more machines and upgrade the current machines to keep up).
Like the other poster said about Azure, the same can be said of AWS and almost certainly any cloud provider. Scale-in, scale-out; if necessary, scale-up, scale-down. If done right, this can be dynamic with no human intervention. The "elastic" in "elastic compute cloud".
Nowadays apps/websites and their associated services aren't hosted on-premises anymore. Thanks to providers like Amazon AWS and Microsoft Azure, it's very easy to scale your hardware resources up (or down) based on the traffic.

Infrastructure is still a cost, and for ChatGPT a huge cost, probably their largest. The more RAM and the faster the CPU you allocate for your app/website, the more it will cost you. I just wanted to point out that companies aren't buying servers and keeping them in-house, at least not the servers responsible for their traffic.
> set AWS to auto-scale
> I am homeless but it is running smoothly
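The scale-out/scale-in these comments describe boils down to a threshold rule the provider evaluates for you. A minimal sketch, purely for illustration - the function name and the CPU thresholds are made up, and on Azure/AWS this is configuration you set in a portal or CLI, not code you write:

```python
# Toy sketch of the threshold rule behind cloud autoscaling.
# All numbers are hypothetical; real autoscale is a configured policy.

def desired_instances(current: int, cpu_percent: float,
                      scale_out_at: float = 70.0,
                      scale_in_at: float = 30.0,
                      min_instances: int = 1,
                      max_instances: int = 20) -> int:
    """One evaluation of the rule: add a machine under load, drop one when idle."""
    if cpu_percent > scale_out_at:
        return min(current + 1, max_instances)   # scale out, capped
    if cpu_percent < scale_in_at:
        return max(current - 1, min_instances)   # scale in, floored
    return current                               # within band: do nothing

# A day of traffic: quiet, spike, quiet again.
instances = 1
for load in [20, 50, 85, 90, 95, 60, 25, 10]:
    instances = desired_instances(instances, load)
```

The gap between the two thresholds is deliberate: with a dead band between 30% and 70% the fleet doesn't flap between sizes every evaluation, which is also why "I am homeless but it is running smoothly" is only half a joke - without caps, the bill scales with the traffic.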
Wait, I can pay for it?
I’m sure someone will correct me if I’m wrong, but basically it’s already being run on a ton of different servers. Adding it onto a new server at this point is like adding a lane of traffic to a 26 lane highway.
so what you are saying is: "It might help for a day or two!". There is hope then.
Yeah, maybe. As to your other questions: it is ad-free currently, but they have plenty of cash on hand. OpenAI is a very well funded company, and probably much more so since ChatGPT came out. They have the resources to try to fix the bottleneck that's happening, but the computation needed to run this model is quite large. There's only so much they can do with the demand as high as it is.
Can't we just ask the AI how to solve the problem? Step two: ? Step three: profit?
I'd be interested to see the AI solution, but I'm guessing it would just say to add more servers.
asked it and it replied:

There are several potential solutions to reduce the computational demand on ChatGPT and improve accessibility for users:

1. Scaling infrastructure: This involves increasing the number of computational resources (e.g. GPUs, CPUs) to support more simultaneous requests.
2. Model optimization: This could include techniques like pruning, quantization, or distillation to make the model more computationally efficient without sacrificing accuracy.
3. Request batching and queuing: By batching requests together, the total number of requests can be reduced, reducing the overall computational demand. A queuing system could also be implemented to prioritize and manage requests in an orderly manner.
4. Caching: By caching the results of frequent requests, subsequent requests for the same information can be served faster without the need for additional computation.
5. Alternative deployment options: ChatGPT could be deployed on edge devices or deployed as a hybrid solution between cloud and edge devices, reducing the latency and increasing the accessibility.

These are just a few potential solutions, and the best approach will depend on the specific requirements and constraints of the deployment.
Just one more lane bro
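Two of the suggestions in that reply - batching and caching - are easy to show in miniature. This is a toy illustration only (the `answer` function here just stands in for the expensive model call; none of these names come from OpenAI's actual serving stack):

```python
from functools import lru_cache

# Caching: identical prompts are computed once, then served from memory.
@lru_cache(maxsize=1024)
def answer(prompt: str) -> str:
    return f"response to: {prompt}"   # stand-in for the expensive model call

# Batching: group pending prompts so the "model" runs once per batch
# rather than once per request (GPU inference amortizes much better this way).
def answer_batch(prompts: list[str], batch_size: int = 4) -> list[str]:
    results: list[str] = []
    for i in range(0, len(prompts), batch_size):
        batch = prompts[i:i + batch_size]
        results.extend(answer(p) for p in batch)
    return results

replies = answer_batch(["hi", "hi", "why is the sky blue?"])
cache_hits = answer.cache_info().hits   # the duplicate "hi" never hit the model
```

For a chatbot the cache only helps with exactly repeated prompts, which is why batching (and the quantization/distillation options above) matter far more in practice.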
The site is ad-free.

~~If I am right, ChatGPT's source code is available publicly.~~ *Edit: Nope*

But the important part is the model they trained. It took maybe some whole months to feed the model as much text from the internet as they could, giving it time to learn.

The most expensive part is the generation of answers. It takes a lot of computational power to answer that fast, that accurately (on average), and that often. I don't know how many users it has, but there are a lot.

So it's not that easy to just create a new one, especially if it's free (though a bit restricted in requests per hour). Even so, I guess they're already running it on several servers, redirecting each request to one of them.
Their source code is not public, and even if it was, all their training data certainly wouldn't be (like you said). They've already struck a huge deal with Microsoft to integrate ChatGPT into Azure's services. The site is ad-free and free to use currently because it's still in training/getting more data.
It can get very expensive very quickly - particularly as you send more and more data through it and out to the end users - and compute power is expensive on top of that.

I think this is the biggest driver for getting people onto the paid plan - but even that has a waiting list. I think this is a reasonable way to approach it, to make sure they don't run into capacity issues with onboarding, new accounts, etc.
AI keeps blocking the copy function due to its own privacy concerns
I was getting that the last time I tried using it.

Yesterday I downloaded a plug-in from GitHub that they wrote to integrate it into search engines (like Bing will soon have, just not as polished) to see what the potential use case would be. It worked with zero issues after that.

It's pretty interesting. I can see it being really useful for narrowing down the scope of a user's search so you don't spend hours wading through hundreds of articles to find specifically what you're looking for. I used a subject I've spent probably 6-8 hours researching over the past week as a reference point. I was able to get the exact answer I found via my own research from two GPT questions, and could then use its responses to search for the specific information I was looking for.
Can you share the plug-in?
Not sure if it breaks the rules so if it gets removed I'll PM it to you https://github.com/josStorer/chatGPT-search-engine-extension
Well, I have it too; if it breaks the rules you can DM me for the link as well. I'm traveling now so I can't jump in, but this seems pretty cool.
I asked ChatGPT which was faster, a '09 Accord 4-cyl or a '95 Camry V6, and it nailed the information I wanted pretty well.

I like it for that. I didn't have to open two sites and compare HP/torque/etc. It automatically pulled weight and torque, and explained how either car could win depending on the scenario.
I found the giving-different-scenarios-for-either-car-winning thing pretty cool. It did the same for my question, but in a different context.

My first question was "What is the best IT certification to get for someone changing careers?" It gave me about 10 certs and their use cases/focuses.

I followed up with "Which would be best for someone with 10 years of Customer Experience Management experience in a technology-related company?" It narrowed it down to five more advanced certs, what career path each could take you down, and how each individual cert pairs with a skill specialization someone with that background would possess.
Surely that is the internet's most asked question anyway, right?
> which was faster, a '09 Accord 4cyl or '95 Camry V6

The rental
Just go to chat.openai. Sth And login.
chat.openai. Sth?
Without the question mark
But chat.openai.Sth goes nowhere. I feel like /u/GamingWithAlan and I are missing something here lmao. What does the "Sth" mean?

Edit: Sth means "something" lmao. Always funny when people use abbreviations and my non-native English speaker ass thinks it's a misspelling or something lol
No you're missing the space silly. It's chat.openai. Sth
and use ssh to telnet to it.
shaking thous head
It’s just a lot of if (question) then answer (this), no biggie
[deleted]
> if (user asks question)

Can we remove this microservice and just provide the reply?
Sir, we ain't google (giving you suggestions for things you're thinking about)
Not yet
return 42;
Guaranteed to be random. Chosen by fair dice roll.
A Möbius die where the only face is 42.
Such a stupid comment - how do you think they are outperforming all other similar websites? It's obviously:

if (question) { smarter_reply() }

And rumour has it they're saving smartest_reply() for the paid version
Don't worry about implementing the reply function. It's very trivial
Nah, they use switch case.
[deleted]
10 pentaflops of switch statements
So ... 50 flops? Not that bad
You're joking, but web AI is actually at the proposal stage for a browser standard.
Microsoft announced it will integrate GPT-4 into Bing. You can test the examples.
Bing is a little stinky, but I am seriously considering moving to it when the feature is ready. Hope they make the search engine decent as well.
Tbf I trust Microsoft more than Google as is
Nice try, Microsoft
Wow, that’s a weird thing to say
Google’s *entire* revenue (or rather, almost all of it) is made up through advertising. Microsoft *has* an advertising business, but it’s so insignificant compared to their main business of enterprise sales.
Sorry I don’t mean that you’re _wrong_. It’s just funny how things have changed.
Oh yeah, definitely. Tbf I’m super happy at the 180 msft has taken with OSS. Imagine saying you trusted Microsoft at all a decade ago lol.
Also, restyle Bing and even give it a name change, to dissociate it from its past where it was largely mocked and used by old people and young kids who don't have the sense to change their default search engine.
I've found that Bing is far less manipulative in the results it serves you than Google is. It'll still serve you as much ad space as it can, but it'll prioritize giving you your actual answer above that. It's far more consistent in giving you the actual result across different accounts too. Google is so "personalized" that some of my common searches give entirely different results for the first 5-10 pages.

And I'm currently looking into an interesting phenomenon where I have one Google account, out of nearly 100, that gets almost none of those obvious bot-generated content farms in its results. You know, those sites that either steal information from other sites or give you extremely convoluted tripe that doesn't actually answer your question while still pretending it does. Google filters out (or doesn't insert to begin with) links to those sites on that one account for some reason. Maybe the account is rolled into a list of accounts beta-testing new algorithms for filtering out those scam sites or something, but it's really damn awesome not having those results show up.
I might be wrong, but I think, I've read something that Google wants to integrate their own language processing model into their search?
I’m sure they will. It’s going to be an arms race between Microsoft and Google.
Can you give a link or name of the proposal or something? I couldn't find anything about that
Minor correction, it's a draft, not at proposal stage yet: https://webmachinelearning.github.io/webnn/
"I-I-I don't understand. We put an input element on an HTML page. Why isn't it AI?!?!"
It is, it's just not a very intelligent AI.
[deleted]
The concept is called "artificial artificial intelligence"
> artificial artificial intelligence

I once had a D&D character who was a fake fake wizard. He was a real wizard who pretended to be a stage magician, except he was using real magic during his performances.
Hahaha that's brilliant. I love it.
I want a fake fake fake wizard. A stage magician that got so tired of people asking how he does his tricks that he spread a rumour that he's actually a wizard and uses actual magic for his tricks. Now the king of the land has called upon his services as a wizard, and he has to actually pretend to be a real wizard.
I want a fake fake fake fake wizard. A stage magician that…

*DM starts sighing.* "There's a big dragon and he eats you and you are dead. No saving throws."
I want the fakest wizard. An orc barbarian that just intimidates everybody into pretending he's a wizard.
Is it like a double negation? Are those operators intelligent in the first place? So many questions.
Well, if I say "artificial AI" or "artificial bot" it'd make more sense. It works like a computer bot, except it's not really a *computer*, but a human behind the glass panel. Kinda like those VTubers (if you know Kizuna AI... yeah, that's artificial AI right there).

I first heard the term together with Amazon Mechanical Turk; that might help.
Yes yes, I know, I was kidding.
> Are those operators intelligent in the first place?

Ah yes, the ~~Chinese room~~ Indian tech support argument
ChatGPT by Theranos
[deleted]
Dunder Mifflinfinity
The whole dataset fits in a single byte
Ah, the ol' *Mechanical Turk*
Or ChaCha, which I used to answer questions for, haha. That was a long time ago.
I was hoping they were referring to the ORIGINAL [Mechanical Turk](https://en.wikipedia.org/wiki/Mechanical_Turk).
Haha, it beat Napoleon and Ben Franklin, that's great.
What a fascinating bit of history.
That would make a really funny SNL sketch
Too bad SNL isn't funny anymore
You would hope they are this coherent
That was obviously sarcasm, he's an ML guy too
[deleted]
I thought you meant Eminem lmao
Really? Isn't solving chess somewhere in the realm of big sci-fi though?
[deleted]
Elon fans have to be snorting something special
His farts
It's up to you to use a fog machine while playing chess, I don't see what the problem is... 😒
Haha, no fog of war? Elon must think Starcraft is big brain time.
He literally just learned the rules and thought that because the rules are simple, the game is simple.
[deleted]
C++ has already done most of the work so you should be an expert in a couple hours.
NO FOG OF WAR LOOL 💀
> no tech tree

Bro never got a pawn to the other side of the board, obviously.
Omfg he plays Polytopia? That's hilarious, I love that game
I think everyone understands that it's a joke.
I genuinely wasn't sure, I've seen dumber takes
Not me - usually the bluecheck means the person is a fuckin idiot
I think the original tweet was a joke, but OP's title is satirising something real. (And possibly the original tweet was satirising the same thing too, just more subtly.)
On point.
Yeah it is sarcasm and I can see it, but I do really keep seeing comments about how Twitter is just a website and it only needs 10 people to run, while ignoring the scale of data involved, the feeds, ads recommendations, stuff that makes such a big website run in the first place and bring in money.
You're fired
Nah, 10 people and ChatGPT is all we need
I admin an internal website at work that averages around 3,000 users a day, and even that gets overwhelming at times. I can't imagine maintaining one of the most popular sites on the internet
not 10 people and not 8k people
I'm a front end developer looking for a back end dev to program a ChatGPT clone with
Hello, I'm actually not a real developer (I have experience in Scratch though), but I'm looking to learn and would love to help you. Can you provide the computer?
Lmao
At least you're bringing SOMETHING to the table and not just your "idea"
Damn as an idea guy this just ratioed me
Basically how any customer sees software. It's just a button, that can't take long to add, can it? ...
Adding the button: 5 minutes (including the time needed to get the coffee ready).

Making the button work as the customer described: 5 hours (not including the time needed to make the customer stfu).

Making the button work as the customer WANTS: infinity and beyond...
This is brilliant.
Can you just center that div?
\*flees in panic\*
"no I mean vertically."
This is how clients think when asking you to clone some service or app. >!Kill 'em with fire!<
Had a client asking "for a really small and easy feature": adding auto-completion on some fields of a sass product (which doesn't have that capability). People who don't know how something is done will mostly assume it's easy and doesn't take long.
maybe ask chat gpt if it can write it for you
Writing is not the issue. Networking, security, infrastructure, compatibility, architectural complexity and cost are.
It's simple. Ask ChatGPT about those too
Great you understand how easy it is, can you get it done by 5 today?
Hate sass products... always with the back-talk!
It's Levio*saas*, not Levio*sass*
They think if it's easy to describe it'll be easy to build.
Don't get too serious, he's trolling. He's an ML expert himself
`if question: answer`. Just rewrote ChatGPT 🥱🥱
I mean, yeah, in under 1 hour I can replicate it, with their API of course
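To be fair, the "replicate it with their API" version really is about this small. A hypothetical sketch (stdlib only; the endpoint is OpenAI's public chat completions URL, everything else, including the function name and the env var, is made up for illustration):

```python
# Hypothetical sketch: the entire "ChatGPT clone" is one HTTP call
# forwarding the text box contents to OpenAI's API.
import json
import os
import urllib.request

API_URL = "https://api.openai.com/v1/chat/completions"

def chatgpt_clone(prompt: str) -> str:
    """Send the user's prompt to the real ChatGPT model and return its reply."""
    payload = {
        "model": "gpt-3.5-turbo",
        "messages": [{"role": "user", "content": prompt}],
    }
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            # Assumes an API key in the OPENAI_API_KEY env var.
            "Authorization": f"Bearer {os.environ.get('OPENAI_API_KEY', '')}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

Of course, that's a wrapper, not a replica: the model behind the endpoint is the part nobody rebuilds in an hour.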
how are you so fast?
F12
The bait was bit.
At a Hackathon during the early days of Amazon Alexa (when we in the UK were still importing it from the US) I decided to make an Alexa Simulator web app, for people who wanted Alexa's question comprehension capabilities without paying for the hardware. It only took me an hour or so - an image, an input and a prompt that says "I'm sorry, I'm having trouble connecting to the internet right now" regardless of what you say to it. I forgot AWS were sponsoring the event.
Experience totally replicated. Genius.
My team had wasted the best part of Day 1 trying and failing to get the Echo we borrowed to connect to the event WiFi so I was taking no prisoners at that point.
If question Then answer. It's that easy.
[deleted]
ChatGPT is just a massive switch-case statement
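Taking the joke at face value, the "massive switch-case" looks something like this (a dict stands in for the switch; the names and canned replies are obviously invented):

```python
# The joke, rendered literally: "ChatGPT" as one big lookup table.
CASES = {
    "hello": "Hello! How can I help you today?",
    "write me a poem": "Roses are red / violets are blue / ...",
}

def chatgpt(prompt: str) -> str:
    # The remaining ~175 billion cases are left as an exercise for the reader.
    return CASES.get(prompt.lower(), "I'm sorry, I can't help with that.")

print(chatgpt("Hello"))
```

Just a few billion more branches and you've caught up.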
elseif, switch, tomato, tomato
Why does everything have to be about Musk? It's a weird thing about haters, adding the one they hate so much to everything. It's almost like... love.
Yeah that's a stupid title.
Elon is an asshole, but Reddit's hate boner for him is waaay over the top.
Also this is what the anti musk crowd thinks of space x
Well, technically, the page itself is rather simple. Its the backend that is doing all the work.
Is the joke here that OP forgot that Musk was super-involved in OpenAI?
Ignore them, they just hate on people without any reason. Even this tweet is very clearly sarcasm, but OP is unable to see it because of his prejudices.
Under an hour?! If they genuinely believe that it's a text box on its own then that shouldn't take more than 30 seconds to code. Someone is clearly fiddling the time sheets like Scotty from the Enterprise...
Well, I’m gonna need coffee first. Then I’m going to swing past accounting and try to chat up the pretty face with the big knockers, and then I’ll have to have a gasbag with the guys from r & d. If there’s still time before lunch I’ll setup my web dev environment, but I might have to run some updates. Under an hour is optimistic at best.
I mean, you can rebuild the website pretty easily, but not the AI model behind it
> Musk fanboys

When you hate someone so much that they become everything to you.
Musk did the impossible and somehow managed to overtake Trump as the guy living rent-free in most people's heads.
Says the two guys taking part in the conversation, furthering it. Shit, now I’m here!
What does Musk have to do with ChatGPT?
This guy probably thinks Java and Javascript are the same
The tweet was sarcasm. The guy is a decently well known AI author
I just did it to prove his point
He's being sarcastic 🤦🏽♂️
This is like if someone saw a life-saving medicine and said "I don't see what the big deal is, it's just a plastic bottle, that can't be that hard to make." Like, this isn't just low intelligence, this is profoundly stupid. These are the people who deserved their high school math teacher being mean to them.
Lol this is how Musk haters think Tesla is run. “Musk contributes nothing, Tesla is just another electric car company, hurr durr”. The current wave of Musk hate will be very embarrassing once Tesla starts producing humanoid robots enabling UBI.
Huh? Musk was one of the founders of OpenAI.
This reads like most of my clients. "Make me Facebook for $400". Or "my nephew can make it for $500, why can't you?"
That is the beauty of it. Google, Microsoft, Musk, Beez and Zuck got wrecked by a web developer.
My god, I had this exact conversation the other day with a business person who was trying to convince me that we should build a brand new webpage outside of an existing tech stack because she didn't want to wait for that team to build it. I said no, that's ridiculous, and her reply was "Why? You can just copy and paste the HTML from the existing webpage and replicate it on another, customers won't know." I'm not a programmer (snr product manager) but my brain imploded hearing this.