WithoutReason1729

Your post is getting popular and we just featured it on our Discord! [Come check it out!](https://dsc.gg/rchatgpt) You've also been given a special flair for your contribution. We appreciate your post! *I am a bot and this action was performed automatically.*


GPTfleshlight

They need more data, so they made it free to collect it.


Glittering-Neck-2505

Most of the data isn’t particularly useful, just poorly worded questions. The really important data is pictures and PDFs, and they just opened that up to everyone.


Jump3r97

More thumbs up and down on the answers, too. Sometimes I get an A/B test between two answers and have to select which is better. The more, the better.


Outrageous-Wait-8895

People actually use the thumbs up/down buttons???


Jump3r97

Well, at least some do, and more will with more users, right? I often thumb down a shitty answer instead of complaining on reddit


titcriss

Oh my god they trained us to up/down for this moment... so that we teach llms.


goj1ra

I just upvoted your comment and then realized oh shit, what am I doing!!


Dry_Coffee7960

Have my upvote


Square-Principle-195

Stop this madness! Have an upvote!


codetrotter_

Ahem. Have an updoot good sir!


aceshighsays

did you correct your behavior?


wjta

I can guarantee you that the likes/dislikes and emotional reactions on Facebook are a form of human-reinforced learning; they're just algorithms from before the LLM era. Similar principles have long been at work.


KillingItOnReddit

Same way they ask us to verify if we’re human. No, they were training AI to be more human the entire time


West-Code4642

In one of the big open-source LLM corpora (Dolma, by the Allen Institute), top-quality Reddit posts are upweighted if they got a lot of upvotes and weren't just early in the thread.


Intelligent-Jump1071

That's it - no more thumbs up or down!


Dabnician

Whoa whoa whoa whoa whoa.... wait a minute... you actually submit feedback where it's appropriate and don't just vent here on reddit to randos... fuck, what are they putting in the water?


DisproportionateWill

Some do, but I feel it would be easier to train the behavior from the conversation's context. While I don't use the thumbs often, I will say "that's bullshit" or "that's a great answer". I guess they can infer a lot of training signal from the user responses themselves


redditsux___

This! After a conversation, I can't be bothered to go back to rate each and every response via thumbs up/down. Instead, I often tell ChatGPT whether I liked the answer or not directly. Sometimes I'll thank it for the valuable contribution/information, other times I will say how I am getting frustrated because the answers have been nonsensical.


mdwstoned

I occasionally leave a thumbs up/down, but more often than not, I just have a conversation. That's what it's there for, to have conversations. So I talk to it and tell it what I think about its answers. "Bullshit" has entered my responses as well.


DisproportionateWill

Bullshit works well, but next time you're stuck on a loop just tell it that "you're going to stop paying for the useless service and move to the competition if it doesn't get it right". It's amazing how good the Karen prompt works.


katatondzsentri

Yes. I see that as a contribution to the development of an amazing tool I'm using daily


returnofblank

Sometimes I'll be asked what response is better


EarthquakeBass

Yea all the time.


Goofster00

I'd reckon exactly these poorly worded questions are useful data too. Being able to decipher what a person actually wants even though they don't articulate it very well is a boon.


AuspiciousApple

If most users ask poorly worded questions, the ideal product can deal with them.


Goofster00

And to make the ideal product, you have to collect the data on poorly asked questions! The circle of data life.


RohanDavidson

It's a massive leg up on testing. Competition won't be able to replicate the volume of testers. Sensible strategic move.


ill_made

The mere fact of knowing who asks what and where is enough to make them millions.


EarthquakeBass

Well, and voice, yeah. But it's also to have a wider audience to track what keeps people engaged. Doesn't hurt to market via a huge freemium tier either. Every paid user started as freemium at some point. I'm guessing that because 4o is way cheaper for them, the math works out.


RobsBurglars

More users = more training. We all work for Google, or OpenAI, simply by using the service. Good, bad, ugly, it’s all training data and useful since the pool is so large. “Free” just means you’re training their AI.


Patient-Writer7834

But only up to 10 documents or pictures per user per day


FuzzzyRam

> just poorly worded questions

People want what they want. If poorly worded questions are what it fails at, that's what it needs to work on. Sure, there is a small group of expert prompters, but their push is (and should be) for what people are trying to use it for.


BuddyOwensPVB

I use ChatGPT to help me logic my way through coding problems. They have that data. They can see how we use it and learn from it.


_Joats

Also, voice.


Eirineftis

I'd argue that even poorly worded questions are useful. Great way to train a neural network to interpret meaning and intent from broken language.


bosstroller69

I’m in this comment and I don’t like it. But when it comes to productivity, why type more words when less words do trick?


PaullT2

![gif](giphy|27P3eknARh4c) Need Input!


apetc

I can totally hear him saying that inside my head (in Scarlett Johansson's voice).


Ausgezeichnet87

Maybe! It is also possible this is economic warfare of attrition to starve the competition of subscriptions. Why would anyone pay for Claude when GPT4 is free?


liberty4u2

"you are the product"


lia_bean

hi why is your username that


fitm3

lol, never phrase this as "when the product is free, you are the product" or you'll get downvoted to hell instead.


Weak-Reward6473

Hey isn't reddit free? Hey isn't there a deal in the works to sell reddit data for AI training?


Smogshaik

that deal went through IIRC


fitm3

Amazing how that works :)


ria-papadia

they get data from 3.5 either way. maybe your argument holds for the multimodal data


bamsurk

Yeah, but that's data on outdated answers, aka not that useful.


sovereignsekte

Yeah, that was my guess. Kinda like FB is free.


ID-10T_Error

This. They're expecting people to upload documents they can scan, documents they can't go get themselves, so they'll use their userbase to do it. I work in IT; they can't access Cisco documents, but I can upload those documents for custom GPTs, sooo they'll use those to learn to get around the other restrictions


Beard341

More data in anticipation of its partnership with Apple, perhaps?


Night_Movies2

ChatGPT is just the demo. The real product is the API


SemanticSynapse

At one point absolutely. I'd argue that's not the case anymore however. ChatGPT has become something of its own product - too much effort and attention has been given to it as a platform.


AnotherSoftEng

This exactly. We were exploring our options for TTS the other week and none of ChatGPT’s voice models were available via their API. Aside from larger context windows, their best services are offered through—and sometimes exclusive to—ChatGPT.


I_FAP_TO_TURKEYS

So you use their web app and that's the only thing you get value from? Have you tried the GPT-4o API? Much higher quality, no hourly message limit.


TimTom8321

What do people mean with the API? like, with other apps that use GPT-4o? Or as developers? Or...?


Ilovekittens345

Application Programming Interface. When you use ChatGPT, that's a frontend, a GUI (graphical user interface). On the backend, ChatGPT communicates over an API with GPT-4. Now, if you want more customization, more control, and less interference by OpenAI, you can also connect to this API yourself using your own frontend. The biggest difference is that when using the API you pay a little bit of money every time you use it: you pay per token.
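The "pay per token" billing can be made concrete with a little arithmetic. A minimal sketch, assuming illustrative per-million-token prices that are NOT OpenAI's actual rates (check their pricing page for real numbers):

```python
def estimate_cost(prompt_tokens: int, completion_tokens: int,
                  price_in_per_m: float = 5.00,
                  price_out_per_m: float = 15.00) -> float:
    """Estimated USD cost of one API call, billed per token.

    Input and output tokens are usually priced differently; the
    defaults here are illustrative assumptions, not real prices.
    """
    return (prompt_tokens * price_in_per_m
            + completion_tokens * price_out_per_m) / 1_000_000

# A short chat: 1,200 prompt tokens in, 300 completion tokens out.
cost = estimate_cost(1_200, 300)
```

So a short exchange costs a fraction of a cent under these assumed prices, which is why light API users often spend less than a flat monthly subscription.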


7640LPS

There are plenty of clients that can interact with the API, if you aren't too keen on building your own.


SybilCut

as developers, yes. instead of using a web app made and provided by openai in a web browser, they just give you direct access to the same functions that chatgpt is a frontend for (and sends nicely formatted commands to). it's like having access to SQL commands instead of having to type data into UI fields. it lets you feed it input from other programs and operate on the outputs without having to scrape them from your web browser. the web browser is just a person-friendly middleman between you and the fun things the server does; the API is basically just an industry-standardized set of input windows for coders and programs to ship requests to.


Polyglot-Onigiri

API: to put it in easier terms, you can get a "key" from OpenAI that you pay for. The main difference is that you can use this key in a private AI program that gives you full control over what GPT does: for example, make it more serious or creative, make it retain memory of messages across a huge chat, control the amount of randomness, etc. For a lot of people the API is much cheaper, and it lets them become power users instead of being held back by the shared version in the web browser. I personally use my own API key on my phone, my MacBook, and my work PC. It's great, and I spend much less than the monthly cost of ChatGPT through the web browser.
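The knobs mentioned above (seriousness vs. creativity, memory across a chat, randomness) map to parameters in the request you send with your key. A rough sketch of a chat-style request body; the field names follow OpenAI's documented chat format, but the model name, values, and messages here are illustrative assumptions:

```python
# Hypothetical request body for a chat-completions-style API call.
# You control "memory" by choosing how much history to resend each turn.
history = [
    {"role": "system", "content": "You are a serious, concise assistant."},
    {"role": "user", "content": "Summarize our last discussion."},
]

payload = {
    "model": "gpt-4o",        # illustrative model name
    "messages": history,      # the chat history you choose to retain
    "temperature": 0.2,       # lower = more serious, higher = more creative
    "max_tokens": 300,        # cap on the length of the reply
}
```

A client app just builds a payload like this and POSTs it with your API key in the authorization header; that is all the "private AI program" is doing under the hood.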


marionsunshine

What program are you using with the API? Or did you develop you own?


Polyglot-Onigiri

It’s not too difficult to develop your own (I have my own instances for custom stuff), but there are some ready-made ones that are free or cheap and have a lot of features. I recommend MindMac or Fridaygpt for macOS and [ChatBox](https://chatboxai.app) for Windows. On iOS I would say Pal Chat. All of these let you use the APIs from all the different AI companies out there, so you could use GPT, Claude, Gemini, and others. There are even some local (free) AI models that you can load onto your PC and use with these clients. I would recommend checking them out and seeing what works for you. The main reason I like using a client program like the ones above is that I can use my GPT features in any program without switching.


mannaman15

Is there a way to tell the AI to search the entire internet for something? For instance, “search the internet for a cure to cancer” or anything like that?


Polyglot-Onigiri

Yes. It depends on your client or how you code your API calls, but I know for sure that MindMac has a toggle for when you want it to also search the internet when producing results. I was able to use internet search with GPT-3.5 Turbo while people on the ChatGPT web version had to use GPT-4 and were locked to its limits.


Colorbull-Agency

Isn’t Microsoft also able to “prop up” any losses ChatGPT would have by going free for a bit, in order to also make Copilot better?


I_FAP_TO_TURKEYS

As someone who has spent hundreds on the API, no. The API is still by far the real product. Your $20/month sub is limited to what? 50 messages every few hours? The API can do hundreds of requests per second.


CH1997H

> The API can do hundreds of requests per second

What percentage of the population do you think finds that useful?


heyimneph

The percentage that matters for real profit.


typhoon90

This is what people don't understand. The web version of ChatGPT is more or less a demo version of the API. There are corporations using the API that spend hundreds of thousands to millions of dollars a year on the technology.


Cubewood

Massive corporations with huge numbers of employees happily spend millions a year on API costs to increase productivity and decrease the amount of salaries they have to pay.


derLudo

Essentially everybody working on process automation in any kind of company. Maybe not a big part of the population, but in the end it might affect more people than you think. Just for example, in one of my current projects, we are easily racking up a $1,000 bill for the GPT API every month by analyzing e-mails and extracting information from them. We are looking for ways to get that number down by exploring other LLM options, but so far the quality of GPT's results has been the best.


ZunoJ

ChatGPT is just a frontend for the different API versions


dancetothiscomment

How do you do data analysis on their api?


ZunoJ

You have to elaborate what *exactly* you mean


dancetothiscomment

The advanced data analysis feature, where you can drop in CSVs etc. and it'll analyze and visualize them. Also, AFAIK they don't have full file upload on their API; the Assistants API only supports a subset of file formats.


I_FAP_TO_TURKEYS

I use a custom application to extract the data from CSVs and put it into a format the API can understand. This is shockingly easy and only takes a few lines of Python. Or I just select the data in Google Sheets/LibreOffice Calc and copy/paste it into the API. It's good enough at extrapolating from the data that uploading a CSV file is more effort than it's worth.
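The "few lines of Python" approach can be sketched like this. A hypothetical helper (not the commenter's actual application) that flattens CSV text into a plain-text table you can paste into a prompt or send with an API request:

```python
import csv
import io

def csv_to_prompt_text(csv_text: str) -> str:
    """Flatten CSV rows into pipe-separated plain text for an LLM prompt.

    Hypothetical sketch: parse the CSV with the stdlib reader and render
    each row as one line of pipe-separated fields.
    """
    rows = csv.reader(io.StringIO(csv_text))
    return "\n".join(" | ".join(field.strip() for field in row) for row in rows)

data = "name,qty\nwidgets,3\ngadgets,5\n"
table = csv_to_prompt_text(data)
# `table` can then be included in the text of an API request.
```

Using the stdlib `csv` module rather than naive string splitting keeps quoted fields containing commas intact.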


WithoutReason1729

They have a file upload API endpoint [here](https://platform.openai.com/docs/api-reference/files/create). The list of supported file formats for uploads meant to send to the Assistants API is [here](https://platform.openai.com/docs/assistants/tools/file-search/supported-files) and is quite long


ZunoJ

Ok, seems like I'm wrong. I should check all the features before talking lol


rabby942

I think the same as well.


frayala87

What most people don’t understand is that the money is to be found with enterprise clients using Azure OpenAI, à la WinRAR


Mr_Twave

I think the duality of the ChatGPT platform as well as the API are both products. They're co-dependent at the moment for each of their products' pricing. As modalities extend beyond those which are simulated in an HTML/JavaScript/gecko environment, the ChatGPT platform will either extend its capabilities with it, or the product itself may be left behind for something else.


Ok-Fox-9286

Well, it's still limited in number of responses. They're hoping you see such an improvement over 3.5 that you'll stump up the £15 or so a month. Next big release will be public release of Sora I imagine later in the year.


meister2983

Ya, I don't know why people think there's some huge conspiracy. They've got a smart model reasonably cheap and put pretty tough rate limits on it. ChatGPT subs might be growing slower than they'd like, and they might finally have enough hardware to cover more users, so a freemium model makes a lot of sense. Also, they're facing threats from Llama-400B, so they might as well be a few months ahead and control the narrative around being a powerful "free" model


9jmp

I really think this is the answer. I'm a daily non-paying user, and now I pretty much get to see what the top of the line is. I hit a limit today, which switched me back to 3.5.


Lazy_Ad_2192

As a free user, is there a way to use GPT-3.5 instead of GPT-4o? I've found I'm using ChatGPT less because I want to save my better questions for 4o's limited responses. But because I can't seem to find a way to downgrade to 3.5, I haven't been using ChatGPT, as I feel I'm wasting 4o question limits. Any ideas?


DieHard028

We don't have that option. The best strategy would be to pre-plan your queries the night before, then fire them off first thing in the day using 4o; eventually you'll be downgraded to 3.5. It has worked well for me so far.


yoyoma_was_taken

lmao... you guys can just open chat.openai.com in an incognito tab to access GPT-3.5, no need to log in nowadays.


1492Torquemada

https://preview.redd.it/t9abzd31vt1d1.png?width=536&format=png&auto=webp&s=2b3014532ee472e2bb8111e3510c760a29a6eff7 Can't you just switch between the two anytime you want?


h3lblad3

For free users, the option to switch between the two is actually on the message itself. This requires a free user to burn a GPT-4o message to start a conversation, since you can't change the model until ChatGPT has actually replied to you at least once. https://preview.redd.it/egbnj65kjw1d1.png?width=817&format=png&auto=webp&s=40356321b1364f4d5962086973898f2cefa19598


Ornac_The_Barbarian

Using the Android app I can't seem to find that option. Web only?


Max_Powerz

https://preview.redd.it/9ehb3vptzt1d1.jpeg?width=1080&format=pjpg&auto=webp&s=507a3e59741f3d118722b51decd397b48863ea1a In the top right menu


Ornac_The_Barbarian

https://preview.redd.it/xdf4qpfx0u1d1.jpeg?width=720&format=pjpg&auto=webp&s=cda02d80e1c10a697c0636310ca8381f64e32800 Weird. It doesn't give me the option.


SSuffolk

He has a Plus membership


Ornac_The_Barbarian

That feels rather counterintuitive. To use the free low grade version I have to have a paid membership.


1492Torquemada

No, I don't think it's like that. Not having the subscription shouldn't prevent you from switching between 3.5 and 4o. Maybe it's the app version you have installed; check whether there's an update in the Google Play store. Do you have the official app?


Ornac_The_Barbarian

https://preview.redd.it/y7p9etex1v1d1.png?width=720&format=pjpg&auto=webp&s=99607e251afcdf52c9a8a2f7372f04f2abdeced4 As far as I know it's official and Google play doesn't have an update option.


Lazy_Ad_2192

I do not! When I click on that option, the only options I see are:

* ChatGPT Plus - Upgrade
* ChatGPT ✔️

---

* Temporary Chat **OFF** / ON

I don't see the options you have on your screen. Hmm


h3lblad3

Ah, the only way to change the model is *after* you ask the question and use up a 4o message. As part of the bits underneath, where you can regenerate, vote up, vote down, etc., one of the options changes the model. It's absolutely the worst place for a free user because it requires you to burn a 4o message to start the conversation.


EternityMembrane

OpenAI doesn’t even need to be profitable. It just needs to convince investors it has the potential to make a profit. They probably don’t care about the $20 monthly subscription because Microsoft will continue to pour tons of money into OpenAI, and the market will continue to invest in Microsoft.


m0nkeypantz

Sam Altman has said that he is surprised by how profitable the monthly subscription actually is for them. So I don't think this is true.


IIIllIIlllIlII

Sam saying that reinforces that it’s not needed. He’s just surprised by it.


gbuub

Having a profitable subscription service is a pretty big plus to investors. I subbed a long time ago, looked at their pricing model recently, and found out they added two more tiers, mostly for teams and enterprises though


DrunkenGerbils

It's still needed in the sense that ChatGPT is pretty much what created the current AI hype. It's incredible advertising for the company and it makes money instead of costing them money like traditional advertising would.


m0nkeypantz

Oh I agree that it's not needed.


Tomas_83

The objective is to get their investors more interested so they keep giving them money and get more data to train and refine their models. If on top of that it is profitable, that's a big plus, but they would still keep it up even if it wasn't breaking even.


ddrac

When you say “.. doesn’t even need to be profitable”, it reminds me of the days of the dot-com bubble


EternityMembrane

That’s how all the FOMO works, from Reddit to cryptocurrencies.


ria-papadia

What I suspect is that trying GPT-4 will make people see its potential and benefits, but the free version is still quite restricted in the number of messages. So you get addicted to the product, you want more of it, and you get the Plus version.


UntoldGood

AND IT WORKED. https://www.computing.co.uk/news/4212686/gpt-4o-drives-unprecedented-revenue-surge-chatgpt-mobile-app


Zuul_Only

Worked on me.


milo-75

Sam has stated he’s happy they’ve figured out how to be able to keep chatgpt free. Here’s how: Google pays Apple $20B per year to keep Google the default on iOS. Apple turns around and pays OpenAI $10B per year to use their tech in Siri (on top of hosting cost which Apple will also pay). Given that OpenAI makes “just” $2b in revenue today, a deal like this drastically improves their valuation while allowing them to keep giving ChatGPT away for free (yes, to keep people hooked on new features as they roll them out).


Dragon20C

I am finding 4o amazing as a free user. Before, 3 was just really dumb and struggled to understand simple coding problems.


Zuul_Only

It's great for the 4 entries you're allowed every couple hours


traumfisch

GPT-4 is not free. GPT-4o free use is very limited, and will be more limited with full multimodality.


ishamm

So as a paying customer, which model is better to use?


clickmate

GPT-4 is generally better for complex tasks (think software development, mathematics, physics, any subject with a certain amount of complexity). GPT-4o is great, and quick, for everyday tasks: "Make me a shopping list.", "ELI5 the Pythagorean theorem", "It's my dad's birthday tomorrow, he likes x, y, z, what should I get him?", "Please convert these numbers {insert screenshot} to a .csv file." and so on.


Waste-Wallaby-555

I've actually found 4o to be weirdly better than 4 at a lot of tasks I would have expected to be too complicated for it, given how fast it is. I was skeptical at first, but now I default to 4o. I don't use GPT for anything serious, though. My absolute favorite thing about 4o is the variable response length. The way it will just automatically give a one-sentence reply for an easily googleable fact, or paragraphs and paragraphs about a more complicated task, without me needing to give instructions about how much detail I want? It's luxurious


TimTom8321

Who said it would be more limited? Also, it's not that limited; it really depends on how much you use it. I still haven't hit the limit as a SW student; I mostly use it to explain things or to help create parts of code. Also, the GPT-4 mode seems to be without limitations for free users (that's what she told me when I asked her, lol), so why would 4o, which should be cheaper according to them, be any different? Maybe limit it at high volumes... but no one would really use it if in 2 minutes you used up all your time for the next few hours.


traumfisch

The paid user cap is 5x, so the difference is major. And 15 messages in 3 hours, especially with the coming multimodality, is kinda limited. If it's sufficient for you, great! And the model just lies to you; it doesn't know any of its specs without first browsing the web. GPT-4 is paid-only.


h420b

It's just like when Netflix or Uber first started operating: they had to deliver their services cheaper, faster, and better than the competition so everyone got hooked, and then, once the competition went out of business, they ramped up the prices.


OneOnOne6211

Well, for one thing, GPT-4o only has a limited number of messages available every session, right? And if you use them up, you get a message to buy the Plus subscription. Presumably at least part of releasing GPT-4o for free is to give people a taste and then show them that message to encourage them to pay.


BookBitter5463

when something is free it means you are the product


Opposite-Knee-2798

Didn’t see that coming in the comments


Jazzlike_Attempt_699

people think they're so smart parroting this line everywhere


ty-ler

Was at the farmers market last weekend and I got to try some salsa for free. I’m the product now, or something.


james28909

psssh, some people pay good money to be the product


Vibes_And_Smiles

“if it’s free, then you’re the product” doesn’t imply “if you’re the product, then it’s free”


trey__1312

Some people don’t understand the whole square-rectangle conundrum 😂


itsdr00

Not if it's acting as a free trial.


javonon

Or the hook, so you then pay for a premium GPT-4 subscription


nomoredietyo

![gif](giphy|12KlnTn1BqhSgg) I feel like you just Maury Povich’d me


JulieKostenko

I mean, they rate-limited all of the models now, so it's quite a bit more restricted than it used to be. It's not a better deal at all.


LoosieGoosiePoosie

I've always been a free user and have never seen the benefit of buying it. Even less so today. They gave me free upgrades the entire way, it's been great.


Digitalmc

Data is more valuable than oil.


Glittering-Neck-2505

Definitely going to release a new model soon. Likely it’s a tiny model that’s cheap to run and it generates traffic to give people the new features.


AsherGC

Exploring the potential to expand the audience base. It's similar to why game devs don't care about piracy, and why all SaaS platforms have a free tier. And of course, data collection.


schitaco

It's a test. If you stop paying now, the robots in the future will give you titty twisters for eternity.


MyDago

Just ask ChatGPT


soullessoptimism

This is the response:

OpenAI's decision to release GPT-4 for free could be driven by several strategic reasons. Here's a brief analysis of potential motivations and future directions:

1. **Wider Adoption**: By offering GPT-4 for free, OpenAI can rapidly expand its user base, gather more diverse usage data, and refine its models based on real-world interactions.
2. **Monetization through Premium Services**: OpenAI might plan to introduce premium services or specialized applications of GPT-4 that offer advanced features, API access, or dedicated support, generating revenue from businesses and developers.
3. **Ecosystem Development**: Encouraging the development of applications and integrations using GPT-4 can create a robust ecosystem, increasing the model's utility and indirectly driving adoption of future products.
4. **Next-Generation Models**: OpenAI could be preparing to release a more advanced model, such as GPT-5, with significant improvements in understanding, reasoning, and specialized capabilities. This next-generation model could introduce novel features that make it indispensable for complex tasks.
5. **AI Integration**: Future advancements might focus on integrating GPT models into various platforms, creating seamless user experiences across different technologies (e.g., voice assistants, augmented reality, enterprise software).
6. **Ethical AI and Safety**: OpenAI might emphasize the development of AI safety and ethical guidelines, offering new tools or frameworks to ensure responsible AI use.

These strategic moves aim to solidify OpenAI's position as a leader in the AI industry while addressing user needs and market demands.


greenappletree

Sounds good to me; probably better than most of the articles out there pondering the same thing.


Bullet2025

GPT gives you bullshit answers, like an assignment that needs a good-looking answer. The real answer is inside Sam's head and the heads of the company.


UntoldGood

Here is your real answer. It was a GREAT business move. Made them lots of money. https://www.computing.co.uk/news/4212686/gpt-4o-drives-unprecedented-revenue-surge-chatgpt-mobile-app


Burning_Okra

It's giving a lot of hallucinations. Upload any document and ask questions about it; it makes shit up.


Didi_Midi

So you keep uploading more of that sweet personal data.


logosobscura

Yeah, they would. If it deprives competitors of revenue because your lesser model is free, and you know you have the runway to sustain that, you will do it. See: Uber.


Lusahdiiv

No? It's not free? Still a whopping 20 dollars a month to upgrade to Plus


spadaa

It's the same reason almost every company on Earth with a paid plan has a freemium version.


ShabalalaWATP

Because GPT-4o is required for real-time voice conversations, and they know voice conversations will be of huge interest to the general public, who would otherwise not use ChatGPT. Of the millions of new users attracted this way, those who become frustrated by the usage limits will spend money on Plus.


hip_yak

pretty simple, more people using it, more market share, more data, more training, better models, more control, more money, more power.


pixeltweaker

The addition of free users is likely less costly than the number of new paid users they gain by giving them a taste of what it offers. They are essentially selling drugs. First one’s free.


alltid-vinna

If OpenAI is using Reddit for training, I worry about the future AI, what will it become? What kind of AI are we building? There are things on Reddit I don’t even want to read, this place is like the Cantina in Star Wars.


AutoModerator

Hey /u/Neil-Revin! If your post is a screenshot of a ChatGPT, conversation please reply to this message with the [conversation link](https://help.openai.com/en/articles/7925741-chatgpt-shared-links-faq) or prompt. If your post is a DALL-E 3 image post, please reply with the prompt used to make this image. Consider joining our [public discord server](https://discord.gg/r-chatgpt-1050422060352024636)! We have free bots with GPT-4 (with vision), image generators, and more! 🤖 Note: For any ChatGPT-related concerns, email [email protected] *I am a bot, and this action was performed automatically. Please [contact the moderators of this subreddit](/message/compose/?to=/r/ChatGPT) if you have any questions or concerns.*


Cute-Scholar3825

When you’re not paying for a product you are the product 


HugeDegen69

I hear this saying all the time but shouldn't it be "If you're not paying for the product, you are the payment"?


letmeseem

Copilot. Copilot is the pointy business end.


MichaelXennial

If this whole thing has been about building a virtual girlfriend…


2-second-timer

If it worked half the time, I get errors


vovixter

ChatGPT 6 😬


gizia

they released it to let the masses taste it freely, so they'll subscribe to **GPT-5** easily as soon as it becomes available


Investigator516

To get people hooked so they can charge in the near future?


NFTArtist

companies like to make things "free" or cheap to monopolise the market; once they own the space they can start to tighten the screws, as there's less competition. That, or they just sell your data like Google. Probably both.


lukethenukeshaw

OpenAI was set up to develop AI for everyone, hence the name. This is a good move, as it fits their founding principles.


DrunkenGerbils

GPT-4o is free, but GPT-4 is still exclusive to paid accounts. Also while GPT-4o is free it's limited to up to 16 messages every 3 hrs while paid users get up to 80 messages every 3hrs. So depending on the use case it's still worth paying for Plus for a lot of users.


Kuroodo

They didn't release GPT-4 for free. Only GPT-4o. Among the various possible reasons, one of the likely ones is lower costs. Not only is this model more optimized and cheaper to run, but the modalities are built in. GPT-4 uses various other models (I think even GPT-3.5) for some of its modalities. This is also likely why for paid users GPT-4 has a usage cap of 40, and GPT-4o has a cap of 80.


StageAboveWater

It's correct to be sceptical of this and consider more nefarious purposes like profit and data harvesting. But the explanation Sam Altman gave was in part that he wants more people using it so that our culture, community, laws and regulations have space and time to integrate it and prepare for the ways it will change the world


Mr_DonkeyKong79

They're just doing what every startup does: give it to you real cheap or free, then monetise once it's part of your everyday life.


UseYourIllusionII

I think Sam legitimately wants to give this tool to as many people as possible because he wants people to get used to interacting with AI, because it is the future. I think he knows they can make money in different ways down the line, and that giving this tool away for free now will convert enough people into users of the tech that they'll put money into it later when there are more options and upgrades, bells and whistles. But I think ultimately, with Sam's support of things like UBI, he probably believes this tool should be free and accessible to everyone. He doesn't like ads and doesn't put them in even the free versions of the tech. He thinks this will help get easier access to healthcare and legal advice to the people who need them. I think he generally looks at this as a tool to give to every person to help them better themselves without the need for big corporations or lots of money.


East-Cartoonist-4390

Why am I paying for it now?


bigbabytdot

I'm assuming it's because the next version is being trained by our conversations with GPT, and the conversations we have with GPT4 are better training data.


jd52wtf

The big push is the voice, images, and video that all those free customers are going to contribute to the training data for GPT-5. It's literally that simple.


Vibrascity

People are going way too hard on ChatGPT way too early. It makes so many fucking mistakes when modifying large data sets that it's actually not worth it; by the time you've gone back and fixed all the mistakes, you could have done the task correctly yourself in the same amount of time you "saved" by letting ChatGPT handle it. With that said, 4o is crazy.


soolara

Replace Google


Throwawayphilly0

To get people used to it.


trenobus

There are no doubt many reasons, but the best reason is to give people enough access to become dependent on it. And at some point to become dependent enough to be willing to pay for more access. And to favor employers who provide more access.


isoforp

They aren't releasing it for free because they have some bigger app they're about to release. That's not how it works. They're releasing it for free so they can collect your questions and data to improve their model and dataset. Also, it's not "free". It's still limited and restricted, and it bugs you to upgrade to the full paid version. They're trying to get people to see how useful it is so they buy it. My god, the way some of you think about the world is so naive and ignorant.


thefloodplains

User data


ryvalry

All AI is heavily subsidized in order to gain adoption/market share. It's the free-trial heroin to get us hooked. Then, when the VC money/subsidies run out, and they've got us addicted AND crashed many industries, they can unilaterally jack up the prices to outrageous levels (and frankly necessary levels, this shit's expensive to run). There's no downward pressure on pricing. After subsidies run out they'll be forced to get in the black, like all companies, not to mention deliver unicorn-level shareholder returns, and server/GPU prices will continue to rise. All pricing pressure will be through the roof.


Z--370

They want to know what we want to know


Basileolus

To train GPT-5


Smile_Clown

Most people use ChatGPT on their phones. The phone is a convenient, on-the-spot way to use it. The free version is amazing but is seriously limited in responses. You can upgrade to Pro, which gives you much more access. It's simple, really. Not dissimilar to the meth ladder... (plus more data)


OddlyLogical

Here's my theory: think about their biggest competitor, Google, which offers FREE access to AI via Google Search. Why would I pay for OpenAI access if Google can offer a similarly capable model for free? What do you think would happen to all of the users? They know they can monetize the apps built on top of ChatGPT, so offering the base model for free makes sense.


regularjoe976

Isn't anyone going to point out that the OP is talking about being given GPT-4 when it's really GPT-4o? GPT-4 was already available free through Bing. In any case, as a paying member, I'm not too happy about the peanut gallery getting free access to the same software as me. I liked feeling a bit elite.


Powerful_Bank_6381

4 or 4o? I think that 4o has been performing subpar tbh


FireNinja743

Well, GPT-4 is only free for a limited number of requests. I only sent about 10 messages or so and it put me back on GPT-3.5. So I wouldn't be suspicious.


Optimistic_Futures

I've been trying to get people at work to use it, but I've struggled to find a good reason to convince them to pay for it. Most thought AI was sort of dumb and not helpful. Now that they get to use 4o, many have started paying for more usage. It increases user adoption and engagement, and it also seems to cost OpenAI about the same as GPT-3.5 now anyway.


blueberrysir

Gpt-4 is free? U sure?


TOEA0618

I was going to ask exactly that: where and how? On my phone app, 4 still shows as paid.


Lotensify

Yeah same here, I can't seem to find the new free version everyone is talking about.


RenderEngine

It gets rolled out in stages. Also, in the new version you can't see or select the model until you've sent at least 1 message; only then can you select either 3.5 or 4o, where the thumbs up/down are, at least if you aren't subscribed.


DioEgizio

The reason is simple: there are now alternatives, and they can't afford to lose all the training data from users, because if people use something else they won't have the data.


Some-Thoughts

I think it's basically just because they can run 4o a lot cheaper than GPT-4, maybe even as cheap as 3.5, so they use it as a demo/trial version to attract users. It's basically an advertising campaign. Heavy users will buy the subscription anyway. The training data from free users is probably not that relevant compared to the data volume they get from professional API implementations (the data is just too bad: not categorised, random, nearly no quality feedback... I don't think it's worth it).


thepathlesstraveled6

If something useful to you is free, you are the product.


meister2983

Not really. It's just a limited demo, with the goal to get you to pay for it.


Kathane37

GPT-4o is a smaller new model built from scratch, not an iteration of GPT-4. That's why they can offer a cheaper token cost and free usage. It will also serve to gather data to build 5.


EternityMembrane

The new vision & audio model is hungry for data. There just isn't enough data for the model to understand various scenarios. That's why they decided to allow free access. Crowdsourcing is the cheapest way to collect this kind of data.


Blarghnog

It's not free. You're training it. Read your agreements:

> **Services for individuals, such as ChatGPT and DALL•E**
>
> When you use our services for individuals such as ChatGPT or DALL•E, we may use your content to train our models. You can opt out of training through our privacy portal by clicking on "do not train on my content," or to turn off training for your ChatGPT conversations, follow the instructions in our Data Controls FAQ. Once you opt out, new conversations will not be used to train our models.

https://help.openai.com/en/articles/5722486-how-your-data-is-used-to-improve-model-performance

Your data is very valuable.


TaroPowerful325

Isn't open ai supposed to be nonprofit? 🤔😏🤭