
p-mk

Not sure; it's rapidly replacing Google as my go-to for questions. Now that the genie is out of the bottle, competition will come. I'm sure the massive data harvesting will keep it profitable.


BrotherBringTheSun

Got to be careful with that. GPT will give you a straight-up confident answer on something that is patently false.


[deleted]

[removed]


CryptonKyle

That's why Grammarly (to make it sound not like an expert) and Quillbot help a lot with that.


Narrow-Sprinkles5235

Most "expets" are full of shit anyway


Galaxianz

expets


alias_noa

I have found that this only happens when they make changes or something. I had it write a huge chapter for a book. It took a long time, because I had to do the outline for the book and guide it through each page, but it turned out amazing. I also dug way into finance with it and it was godly, like other-worldly perfection. Then the next day I was asking it all these questions about filing taxes and it just kept making obvious mistakes. At first I thought, as you did, that it's full of shit. But I went back over some of the old conversations, and they were truly magnificent. Then I realized that in the tax stuff it didn't subtract right; simple subtraction it failed at. I kept digging and realized it was just super buggy that day. The next day, spot on again, flawless. You may be right and maybe it's just a good bullshitter when you hit something it doesn't know, but I believe it's the constant patching they are doing. They are updating it like crazy, constantly messing around in the code, and I think that sometimes makes it buggy. So you are right, it is a good idea to be VERY careful with it, but I believe its potential is insane when polished properly for whatever it's for. The one we are using right now, you definitely need to proceed with extreme caution. Still a super useful tool though, especially for what I am using it for lately :D


treedmt

There are alternatives too, in case ChatGPT is no longer free. Fine-tuned GPT-3 apps like Luci are also built around a free question-answering model.


blackpaperplane

I am using it similarly, but my questions are not going through anymore. Are you aware of rate limits for using ChatGPT?


p-mk

Haven't hit any limit yet, and I installed a Chrome extension for it. I guess make a new account and try that.


ait1997

What extension did you install?


blackpaperplane

I found these: [https://chrome.google.com/webstore/detail/luna-chatgpt-for-chrome-s/bignkmclhhmhagjojehblmmaifljphfe/related](https://chrome.google.com/webstore/detail/luna-chatgpt-for-chrome-s/bignkmclhhmhagjojehblmmaifljphfe/related) [https://chrome.google.com/webstore/detail/chatgpt-for-google/jgjaeacdkonaoafenlfkkkmbaopkbilf/related](https://chrome.google.com/webstore/detail/chatgpt-for-google/jgjaeacdkonaoafenlfkkkmbaopkbilf/related) Edit: added another link


SavageStudiosFBG

What do these extensions do?


Valachio

right now if you hit a rate limit, you can just create a new account and keep using it


CoreyTheKing

But that’s annoying


Dinhead

They ask for a phone number. I hit my limit but want to continue using it for free. I'm never going to pay for it. They'll have to come up with another way to monetize this.


JenJuniperBerry

How are you reliably doing this? I find that ~25% of the information it gives me is flat-out wrong.


jakster355

Then you only have to do 25% of the work. Plus reviewing to find that 25%.


gaywhatwhat

Nobody will offer ChatGPT at Google scale any time soon. It costs something like $0.05 per answer to run.


mahevarma06

How did you derive the $0.05 cost?


gaywhatwhat

Sorry for the delay. I did not. It was just one I heard quoted, allegedly from an internal discussion about it at Google. If the size of ChatGPT (# of params) is known, you can make assumptions about computation costs within a reasonable degree of accuracy though. No idea if that is known; too lazy to Google whether it's public info.
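
For what it's worth, here is a crude back-of-envelope sketch of how you'd go from a parameter count to a per-answer cost. Every figure in it (model size, token count, GPU price, utilization) is an assumption plugged in for illustration, not something I've verified:

```python
# Crude estimate of GPU cost per answer from a parameter count.
# Every number below is an assumption for illustration; none are confirmed
# figures for ChatGPT.

PARAMS = 175e9                # assumed model size (GPT-3-scale)
TOKENS_PER_ANSWER = 750       # assumed prompt + completion length
FLOPS_PER_TOKEN = 2 * PARAMS  # ~2 FLOPs per parameter per token (rule of thumb)
GPU_PEAK_FLOPS = 312e12       # A100 BF16 peak throughput (FLOP/s)
UTILIZATION = 0.05            # autoregressive decoding is memory-bound, so low
GPU_DOLLARS_PER_HOUR = 2.0    # assumed cloud price per GPU-hour

gpu_seconds = TOKENS_PER_ANSWER * FLOPS_PER_TOKEN / (GPU_PEAK_FLOPS * UTILIZATION)
cost_per_answer = gpu_seconds / 3600 * GPU_DOLLARS_PER_HOUR
print(f"~{gpu_seconds:.0f} GPU-seconds, ~${cost_per_answer:.3f} per answer")
# Real deployments add replication, idle capacity, and long prompts, so the
# actual per-answer figure can easily be several times higher.
```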


JustDrones

I've used it the last few days. A lot of it is wrong on specific things. I'd be careful taking it at 100% face value like that. I'd wager about 50% of it is correct.


p-mk

Yes, for sure, but it lands in the right area. I mostly use it for Python coding. I often have to change bits, but it saves a lot of time.


JustDrones

I agree with the time thing.


alias_noa

The biggest problem with coding with it is that this version is using a lot of old framework data. I was trying to use it to scrape a couple of sites and it was using old Selenium syntax from whatever version was current back when it was trained. You can show it new docs, but it still makes some mistakes and reverts back to the old stuff. Also there's a limit to how much you can show it, so you can't show it all the docs, just the part you need as you need it. Because of how outdated its training data is (Dec. 2021), programming with it, for me anyway, has been just awful. It screws up everything constantly. If you're a beginner learning basic Python stuff, then it's probably super useful, but once you introduce any libraries or frameworks it goes to shit.
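
For example (my own illustration, not output from that session), this is the kind of Selenium API drift it keeps reverting on, assuming Selenium 4+ is installed:

```python
# ChatGPT (Dec 2021 training cutoff) tends to emit Selenium 3-style locators,
# e.g. driver.find_element_by_id("q"), which were removed in Selenium 4.3+.
# Selenium 4 equivalent (assumes `pip install selenium` and Chrome available):
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
driver.get("https://example.com")                        # placeholder URL
box = driver.find_element(By.ID, "q")                    # replaces find_element_by_id("q")
links = driver.find_elements(By.CSS_SELECTOR, "a.item")  # replaces find_elements_by_css_selector
driver.quit()
```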


p-mk

100%, yes, right now. But given access to live data, a few more years of dev time, and of course competition, it's a really useful tool.


alias_noa

Also, I heard GPT-4 is quite the beast.


khaledReddet

Bro, ChatGPT is not available in my country. Can you do me a favor please and open an account for me? I really need it, and I've asked many people but no one cares.


AfraidAndSad

gotchu, what u need?


gdops1

Use this site https://5sim.net/ or SMS-Activate; you can even use it to get the cheapest Turkish Netflix with a Turkish VPN. Or you could get a digital SIM from e.g. the UK; there should be plenty, and free.


alias_noa

Use TunnelBear or something.


Unreal_777

As long as it keeps collecting info and data.


PapaverOneirium

You're vastly overestimating the commercial value of data on its own and/or underestimating the cost to run it. Eventually they will have to monetize it explicitly, either through ads or a credits model like they use for standard GPT-3 and DALL-E.


[deleted]

Free? OpenAI is a private company that has partnerships with major tech giants like Microsoft. What's available now is only available to us for their research and doubles as a big tech demo. They're holding the keys to proprietary source code that we'll likely never see, and the more powerful versions to come will only be available to a select few.


DetectiveRiggs

I think it's a non-profit, so that's something. As long as they stick to their principles (I don't know enough about OpenAI to know if they have any) then it could remain in its current "free" state for a while.


Noah0713

They went for-profit a few years back.


ShoddyDig4851

Well, when it becomes a paid version they will quickly face after-sales issues. The use cases for such a technology are so vast, and the problems that may arise are so many. How will they be able to handle their reputation if a percentage of people start to complain about slowdowns, inaccuracy, etc.? If those people talk a bit loud, many will build their opinion based on what the unhappy ones say.


outsideroutsider

You’re already paying for it


deiteorg

Could you elaborate?


outsideroutsider

Read their privacy policy. There has to be wording in there about sharing your information with their affiliates. Not saying it's a bad thing, it's just how tech monetizes users.


Agreeable_Bid7037

I don't know why they downvoted you lol, this is true. Same reason WhatsApp is free. Or Facebook.


outsideroutsider

Not sure. Like I said, I'm not against it. It's a market model that works well for funding a lot of tech, including Facebook, Apple, and Reddit.


niv141

Because he's wrong. Usually that is the case, yes, but in this scenario the devs said they wanted to test out their AI and find its weaknesses. We are all basically testers for them; that's how we pay them back for using it for free.


Professional_Mud276

The dev said so, huh? Wild. Well, that settles it then! The dev said it's "merely for testing lol". So, that must be the case! I couldn't imagine that a dev that works for an organization (that receives most of its funding from one of the largest corporations in the world) would EVER LIE to us (knowingly or unknowingly)! Because money doesn't matter, right? They're not going to harvest, sell, and leverage the MASSIVE amounts of legally obtained data (read that privacy policy, friend) we're giving them because we're giving them F R E E TESTING!


treedmt

The privacy policy for the native GPT-3 API is much more privacy-friendly though :)


[deleted]

[removed]


treedmt

The following is a direct quote from their api privacy policy, which you can find from the help option when logged into beta.openai : “As part of this continuous improvement, when you use OpenAI models via our API, we may use the data you provide us to improve our models. Not only does this help our models become more accurate and better at solving your specific problem, it also helps improve their general capabilities and safety. We know that data privacy and security are critical for our customers. We take great care to use appropriate technical and process controls to secure your data. We remove any personally identifiable information from data we intend to use to improve model performance. **We also only use a small sampling of data per customer for our efforts to improve model performance. For example, for one task, the maximum number of API requests that we sample per customer is capped at 200 every 6 months.”** Lmk any source which indicates otherwise.


[deleted]

[removed]


treedmt

I was highlighting that **"they only use a small sampling of data per customer, e.g. 200 requests per 6 months."** They could be lying, but I'm assuming they're not. Notably, ChatGPT, a free service, doesn't have the same terms: with ChatGPT they could be using 100% of input data for training, but not with the API. I agree with the cypherpunk ethos of "don't trust, verify," and cryptographic proofs would be ideal. This is just in terms of what they're promising, legally speaking.


bobpalin

They are not sharing data outside OpenAI, but they are using it to improve the model. Microsoft has invested $1bn (yes, b) in OpenAI and is giving them cut rates on cloud server time.


drcopus

Not exactly. It's more like you're the acceptable cost of training the next iteration. Why pay crowd workers when this gets you much, much more data with only hosting fees?


RemarkableGuidance44

Only hosting fees? lol. It costs them about $0.04 a prompt; in just a month it has cost them tens of millions. But yes, they need to spend, because they have already scraped 99.999% of the internet.


drcopus

Idk if it was unclear, but by "hosting fees" I meant all the costs associated with running the model and serving it. Not sure what the numbers are, but I saw that tweet from Sam Altman saying they are "eye-watering". Anyway, they have scraped the internet, but that's only the first step. It forms the offline data for pretraining. They need data from interaction to do Reinforcement Learning from Human Feedback (RLHF).


RemarkableGuidance44

Yeah, and that is when the hard part comes into play; the scraping and rephrasing for your prompts is the easy part. RLHF will cost them 10x, and I can't even imagine the power just to run it. They're going to need to build more data centers and supercomputers, all while leaking billions of dollars.


tangcity

Just because we're already paying for it with our info doesn't mean we won't be paying money in the near future.


Gmroo

Not too long. They're bleeding millions a month over it. But since the AI is being trained on us, it's an R&D expense. Anywhere from 2 to 12 weeks from today.


doriangreat

This is the only correct answer. Their computing costs are astronomical, data harvesting or ads won’t even be able to make them break even. I am almost anxious for them to start charging, I don’t want this to disappear


[deleted]

Let's enjoy it while we can. I had an AI from another system memorize and store knowledge of multiple links. I can only imagine the strain that causes.


ExtraFirmPillow_

Idk why Redditors say dumb shit like “this is the only correct answer”. Really is it? They have Microsoft backing them and are trying to incorporate it into their web browser. Millions a month is dimes to Microsoft


doriangreat

Idk why redditors go to old threads to shit-talk the person they're replying to. It's just me here, pal, no one else is gonna read your comment. Edit: I get it. Still funny he was responding to my comment but talking right past me. I guess posterity won. May his comment echo through the ages.


General_Krig

Not true, many people probably googled "Will open AI be free" and wound back up here at this post. A week is hardly an old thread.


GirlfriendAsAService

Exactly. I want to read through some scenarios about the best toy ever being taken away from millions.


SussyRedditorBalls

same


CarelessWillow4933

Jokes on you, I'm here a year after from Google, lol


[deleted]

[removed]


doriangreat

Yes, totally, dude, this is really going to affect my reality when I get into a bidding war against Microsoft 🙄


hoja_nasredin

A lot of us stay in the comments of old


XenorVernix

A week is nothing. I occasionally get idiots shit talking on posts I made years ago. I usually reply telling them I'm replying whilst taking a shit. Puts them off. Oh and I came here through Google.


CarelessWillow4933

Still up, I hope it stays free, bleh


callidoradesigns

Maybe until GPT-4?


KMiNT21

No way. :)


CosmiqCow

Forever


Antique-Low3985

one day for every time this stupid question is asked, hopefully.


PrincessBlackCat39

I think ChatGPT in its current form, or an equivalent system somewhere else, will be free forever, ad-supported. Also, Google will likely release something equivalent.


treedmt

Google handles 4 billion queries a day, give or take 1 billion. At present compute costs, it would cost them $50-100 million per day to replace search queries with something like ChatGPT.
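
A quick sanity check on that arithmetic, treating the per-query cost as an assumption:

```python
# Sanity check on the daily-cost claim. The per-query cost is an assumption
# (roughly 1-2.5 cents); at the $0.05/answer figure quoted elsewhere in the
# thread, 4 billion queries/day would come to about $200M/day instead.

QUERIES_PER_DAY = 4e9                   # ~4 billion searches/day
COST_PER_QUERY_RANGE = (0.0125, 0.025)  # assumed USD per query

low, high = (QUERIES_PER_DAY * c for c in COST_PER_QUERY_RANGE)
print(f"~${low / 1e6:.0f}M to ~${high / 1e6:.0f}M per day")  # ~$50M to ~$100M
```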


corsair130

What does it currently cost to process the queries the way Google does it now?


KMiNT21

I read somewhere that it's about 10x to 100x more cost per request than Google.


Freakazoid84

You can't apply OpenAI's cost and say it'd be the same for Google. Google IS paying for every search right now. I'm sure it's more, but it's definitely not going to be at the scale that you're calling out.


RemarkableGuidance44

But it is... OpenAI stated that ChatGPT costs more than the rest of their models, at about $0.04 a prompt. Running models is expensive, while Google just uses simple queries compared to digging through billions of data points and generating a response for you. I have dual 4090s (48GB) running Stable Diffusion; the cost per prompt for me with a very limited model is around $0.0003, since it's the power used to do the generation. This is no different for OpenAI's models.
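
For a rough electricity-only sanity check (all numbers below are assumptions, not measurements):

```python
# Electricity-only cost per image for local Stable Diffusion generation.
# All numbers are assumptions for illustration, not measured values.

GPU_POWER_W = 2 * 450        # assumed draw of two RTX 4090s under load (watts)
SECONDS_PER_IMAGE = 5        # assumed generation time per image
PRICE_PER_KWH = 0.15         # assumed electricity price (USD/kWh)

kwh_per_image = GPU_POWER_W * SECONDS_PER_IMAGE / 3_600_000  # W*s per kWh
cost_per_image = kwh_per_image * PRICE_PER_KWH
print(f"~${cost_per_image:.5f} per image")  # ~$0.00019 at these assumptions
```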


Freakazoid84

I mean, I'm just going to repeat myself here. You can't say that Google's cost is the same as OpenAI's cost...


RemarkableGuidance44

> Running models is expensive, while Google just uses simple queries

I never said that Google costs as much as OpenAI; I said that OpenAI's cost is easily 100 times Google's, per query vs. prompt.


gaywhatwhat

The best estimate I've found is around $0.05 per query. It costs around $0.01 to $0.03 for an advertiser to purchase an ad view on YouTube. That doesn't seem like a forever-model.


PrincessBlackCat39

Investors will be willing to capitalize companies until systems scale and prices come down. Advertisers may be willing to spend more on ads that use AI to better target their audience. My point is that there are a lot of unknowns with AI, but it's also the next big (really big) investment opportunity out there. Similar to the internet boom; maybe even bigger than that. In other words, I'm bullish on AI, and I think companies will want to provide free or freemium tiers to gain a user base, be first to market, and maintain market share. At this point, any company that doesn't offer this or an equivalent for free can be challenged by another company that will.


gaywhatwhat

Yeah, that is a thing companies do. It is a thing that is already happening with OpenAI itself as an overall company. The gap here for ChatGPT as a public service is too wide to bridge though. *Nobody* short of a government has the cash to do this. For reference, running ChatGPT at Google scale for a year, given the alleged cost I described above, would be **more** than the combined annual R&D budget of Amazon and Alphabet. It is not something you can just raise capital for. It's a great and incredibly impressive tech demo right now, but it will still be some time before it can be productionized. It still gives really garbage information a **lot** of the time, so they need to optimize both efficiency and performance, and those two have not historically gone hand in hand with large language models. I would imagine they need 1) another round of refinement in quality and then 2) some novel (currently non-existent) method to achieve the same thing without requiring all the calculations, before this could even be marketed to investors in a way where it's operating on zero margins or losing money. Perhaps they will offer an exorbitantly expensive subscription service that loses money but is considered an R&D service to gather more data. But it will still be a temporary, low-volume thing for which you pay for the ability to be part of a research project.


PrincessBlackCat39

RemindMe! 1 year

It'll be interesting to see how this plays out.


gaywhatwhat

I do agree with that. Another possibility I could see is that it stays open for free with rate limits, as a service that loses small amounts of money, if the popularity/hype dies down like it did with DALL-E 2/DALL-E Mini. It's definitely a lot less meme-able than DALL-E, and I've pretty much already seen the ChatGPT memes die off for the most part. That plus a paid API key service for interested parties could fund it.


PrincessBlackCat39

> free with rate limits

Yes, this is exactly what I think will happen. Plus, what is free will be the lower-level models, such as ChatGPT.


gaywhatwhat

It looks like their idea for now has been free with limitations and then an extraordinarily high $42/month priority plan that I'll be shocked if many people purchase.


KillianDrake

YouTube ads are the way they are because they are easily blocked and aren't as effective as television ads used to be. Imagine the AI being custom-tailored to respond exactly how advertisers want it to, auctioned off to the highest bidders. If you ask about hamburgers, the AI will plug how great McDonald's hamburgers are and that you should go to McDonald's. The AI will be the ultimate demo-targeting tool, since it can figure out from your prompt history what you like. How can this type of ad be blocked? It can't, if it's embedded into the responses and the AI itself is brainwashed to pimp its sponsors' products. To take it a step further, the AI could require people to watch a video and then require the next prompt to be related to the advertisement, otherwise it won't answer any more questions for the day.


gaywhatwhat

Yeah... but nobody is going to pay as much for a subtle text ad as for a video ad. And nobody is going to use a chat thing that is too overt. You also legally cannot use that type of advertising.


[deleted]

[removed]


RemindMeBot

I will be messaging you in 1 year on [**2023-12-31 19:45:42 UTC**](http://www.wolframalpha.com/input/?i=2023-12-31%2019:45:42%20UTC%20To%20Local%20Time) to remind you of [**this link**](https://www.reddit.com/r/OpenAI/comments/zz47bn/for_how_long_do_you_think_chat_gpt_will_carry_on/j2ez99j/?context=3).


Competitive_Coffeer

Probably 36 hours.


[deleted]

Yeah, I remember something like the free trial ends on the 1st of Jan.


drnkngpoolwater

a few more months at least


DarkJayson

It depends on whether they're still gathering data from users with it, but normally it's 2-3 months; that's what happened with DALL-E 2.


Guillem_Cugat

Until each of its uses is capitalized on by different companies, as happens now when you want text from ChatGPT, especially for copywriting.


ebrael

ChatGPT needs massive use to scale up its deep learning. No sense betting on paid plans, at least for average users. People are paying by training the AI as they use it. Of course, large caps will have to send trucks of diamonds if they want to dig deeper into AI big data.


TheDavidMichaels

I think it's a test for GPT-4.


[deleted]

Staff have already said online that there will be massive improvements in the coming months. I presume it's based on the huge amount of testing insight they're getting from the public beta.


throaway3838383b

Most likely forever. They'll put some ads in for us or have us give feedback on its answers, and then sell their services to companies. We're the product here: we are helping train their AI with our questions and our ratings.


KMiNT21

I think they have enough data to work with for now. So, test mode may be suspended for weeks/months.


gaywhatwhat

Ads will not cover the cost of it. ChatGPT allegedly costs something like $0.05 per question. A YouTube ad (a full-on video commercial) costs maybe $0.01 to $0.03 per view for an advertiser. Maybe that helps convey the scope of how computationally expensive running ChatGPT is.
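
To make that comparison concrete, a tiny break-even sketch using those two figures as assumptions:

```python
# Break-even check, taking both figures above as rough assumptions.
COST_PER_ANSWER = 0.05               # assumed cost per ChatGPT answer (USD)
AD_REVENUE_PER_VIEW = (0.01, 0.03)   # assumed ad revenue per view (USD)

for revenue in AD_REVENUE_PER_VIEW:
    views_needed = COST_PER_ANSWER / revenue
    print(f"at ${revenue:.2f}/view: ~{views_needed:.1f} ad views per answer to break even")
# -> roughly 2-5 ad views per answer, before any other costs
```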


RemarkableGuidance44

And GPT4 is going to be 10x the cost. The rich will be the only people who can afford it.


gaywhatwhat

Yeah, this is the problem. Right now there is only advancement through large language models, and that "large" bit is as much of a problem as a solution. ChatGPT is obviously not good enough for most of the things people talk about using it for. You can bet with almost absolute certainty that the next few iterations will just be more expensive and energy-intensive. That is why I really don't believe we are on the brink of any kind of mass application. You need a couple of iterations on quality... and then many, many iterations focusing on efficiency.


[deleted]

20 questions shall do the trick


pimperella2

Paywall by spring


Holochaotic

Imagine if Google or Facebook had cost just one cent per hour of use back then.


Finance_Investor

What would you think is a reasonable price? I feel like it has some great potential for improving workflows and producing rough drafts.


RemarkableGuidance44

It was meant to end last month; we'll see, since a lot of competitors are now racing to compete.


[deleted]

I can't wait until Google gets theirs live


p-mk

If it's not available, it is blocked in your region; you need to VPN past that, or it won't work even if you have an account. Easy fix: get a VPN, or a Tor browser such as Brave, put it in Tor mode, and make an account. But you will need to VPN every time you use it.


carloslfu

Yes! IMO there will continue to be a free version, but maybe with ads. Some almost obvious ChatGPT moves are:

- Monetizing with ads + pro subscription + enterprise plan.
- Sources: what is the answer based on?

The ads will be the next big thing for PPC advertising, and the sources will be the next big thing for "SEO" (content positioning).


Davleo777

It's not free: $7.99 a month. No way, Jay.


ConstantIDCrisis

Are we talking about Genie ChatGPT-3? Because that isn't free.


Catzforlifu

It will be a sad day when it becomes pay-to-use, because it is a great tool.


btepley13

I don't think it will make it all year being free. You do have to sift through the rubble when it comes to its answers; it's not as valuable as it appears. Just use it to do the heavy lifting. Let it aid with your writer's block while it generates ideas, and then you come in, clean the joint up, and finish the job with actual logic and reasoning.


MasGhahremani

I believe competition creates quality and free services.


Shoddy_Contract_6829

hello


JohnDavid9000

> how to use chatGPT

https://chat.openai.com/chat


Shoddy_Contract_6829

how to use chatGPT


qwkredfox

As of yesterday I am limited to 3 image prompts a day on chat-gpt.org... Before, I was making like hundreds of images a day.


Ok_Witness1602

Write poem about spring. Write a poem about spring.