Hacker News

The bleeding obvious is that OpenAI is doing what most tech companies for the last 20 years have done. Offer the product for dirt cheap to kill off competition, then extract as much value from your users as possible by either mining data or hiking the price.

I don’t understand how people are surprised by this anymore.

So yeah, it’s the best option right now, when the company is burning through cash, but they’re planning on getting that money back from you eventually.



> Offer the product for dirt cheap to kill off competition, then extract as much value from your users as possible by either mining data or hiking the price.

Genuine question, what are some examples of companies in that "hiking the price" camp?

I can think of tons of tech companies that sold or sell stuff at a loss for growth, but I'm struggling to find examples where those companies were then able to turn dominant market share into higher prices.

To be clear, I'm definitely not implying they are not out there, just looking for examples.


Uber is probably the biggest pure example. When I was in uni, when they first spread, Uber's entire business model was to flood the market with hilariously low prices and steep discounts. People overnight started using them like crazy. They were practically giving away their product. Now they're as expensive as, and sometimes more expensive than, any other taxi or ridesharing service in my area.

One thing I'll add is that it's not always that this ends with higher prices in an absolute sense, but that the tech company is able to essentially cut the knees out of their competitors until they're a shell of their former selves. Then when the prices go "up", they're in a way a return to the "norm", only they have a larger and dominant market share because of their crazy pricing in the early stages.


Yeah, I kinda wonder why people even use them anymore. I've long since gone back to real taxis because they're cheaper and I don't have to book them; I can just grab one on the street. Much more efficient than slowly watching my driver edge his way to me from 3 kilometers away.


The number of places where you can reliably walk out onto the street and hail a taxi is pretty small. Everywhere else, the relevant decision is whether calling a dispatcher or using a taxi company's app is faster/cheaper/more reliable than Uber/Lyft.


Living in the Detroit Metro area, seeing that interaction in movies always blew my mind. Just walk outside and get a taxi? Crazy talk.


Here in Barcelona it works like that. 2-3 free taxis pass my place per minute or so. Just wave and you've got one. They're also very cheap.

But yeah the movie thing is not at all unrealistic here. Though at night I usually wave the torch on my phone to attract attention because they don't always see a raised hand.


Here in Barcelona it's great, I really never have to call one. It's always faster just waiting.

At the busiest time it's a bit harder but at that time the ride-sharing services are also overloaded so it's still faster to just wait for a green light (free taxi). We don't get Uber as far as I know but we do have a similar thing called Cabify. But it's useless if you need something quick and they've put the prices up too much. I now only use them for scheduled stuff like airport dropoffs.


I think the cab companies here have apps. I don't know how good they are, though I've been meaning to find out.

There's no way I want to ever wait on hold for a dispatcher again, or be mystified as to when my cab is arriving (this always seems to involve standing outside in the snow or pouring rain).

If the cab companies have apps comparable to Uber and Lyft, sure, I'll give them a shot.


"Uber/Airbnb is expensive now" is an entirely American phenomenon. In Europe and Latin America, both are still cheaper than the alternatives (comparable hotels and yellow cabs). Most likely in other parts of the world too.


The Google Maps API price hike of 2018 [1] is a relevant example.

[1] https://kobedigital.com/google-maps-api-changes


- Uber/Lyft have increased prices significantly (and partially transitioned that into longer wait times) since they got into profitability mode

- Google is showing more and more ads over time to power high revenue growth YoY

- Unity has just tried to increase its prices


I think Google fits more in the "extract as much value from your users" bucket than the price hiking one.

Uber/Lyft did raise prices, but interestingly (at least to me), if the strategy was to smother the competition with low prices, it didn't seem to work.

Unity is interesting too, though I'm not sure it would make a good poster child for this playbook. It raised prices but seems to be suffering for it.


Everyone's in "show your profits" mode, as befitting a mature market with smaller growth potential relative to the last few decades. Some of what we're talking about here is just what happens when a company tries to use investment capital to build a moat but fails (the Uber/Lyft issue you mentioned -- there's no obvious moat to ride-hailing, as with many software and app domains). My theory is that, going forward, we're going to see a much lower ceiling on revenue coupled with lots of competition in the market as VC investments cool off and companies can't spend their way into ephemeral market dominance.

As for Unity, they're certainly dealing with a bunch of underperforming PE and IPO-enabled M&A on the one hand (really should have considered that AppLovin offer, folks), but also just a failure to extract reasonable income from their flagship product on the other; I don't think their problems come from raising prices per se (game devs pay for a lot already, an engine fee is nothing new to them) as much as how they chose to do it and the original pricing model they tried to force on their clients. What they chose to do and the way they handled it wasn't just bad, it was "HBS case study bad."


Uber, Netflix and the online content streaming services. These are probably the most prominent examples from this recent 2010s era.


Unity. Atlassian. Jetbrains. Elastic.


I was most thinking of Uber and AirBnB for price hiking and Google and Facebook for extracting value.


VMWare, Docker.


OpenAI doesn’t own transformers, they didn’t even invent them. They just have the best one at this particular time. They have no moat.

At some point, someone else will make a competitive model, if it’s Facebook then it might even be open source, and the industry will see price competition downwards.


This argument has always felt to me like saying “google has no moat in search, they just happen to currently have the best page rank. Nothing is stopping yahoo from creating a better one”


Google has a flywheel where its dominant position in search results in more users, whose data refines the search algorithm over time. The question is whether OpenAI has a similar thing going, or whether they just have done the best job of training a model against a static dataset so far. If they're able to incorporate customer usage to improve their models, that's a moat against competitors. If not, it's just a battle between groups of researchers and server farms to see who is best this week or next.


But that's exactly what they have: millions of high quality, rated chat interactions that no one else has.

I don't know how they could _not_ incorporate customer usage to improve their models.


Well, this assumes the chat (where the ratings are given) is what people are using and paying for. I think most businesses pay for some combination of API access and specific use cases like code generation (at least, that's what I pay for) that don't really feed into RLHF data. General search for consumers is likely to schism, since ChatGPT isn't especially different from Bard or Edge's AI assistant or the myriad of other product surface areas that can add it.


Yes, the chat interactions don't help with capability (what the model can do); they only help with alignment (what it should do). And you don't need a lot of data to get good results. Crowdsourcing will be enough.


It’s a different situation computationally. Transformers are asymmetric: hard to train but easy to run.

There is no such thing as an open source Google because Google’s value is in its vast data centers. Search is hard to train and hard to run.

GPT-4 is not that big. It's about 220B parameters, if you believe geohot, or perhaps more if you don't.

One hard drive.
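As a rough sanity check on the "one hard drive" claim, here's the back-of-envelope arithmetic, taking the rumoured 220B figure at face value (it's not an official number) and ignoring any duplication across experts:

```python
# Back-of-envelope: storage needed for a model with 220B parameters
# at common numeric precisions. 220B is the rumoured figure, not official.
PARAMS = 220e9  # 220 billion parameters

for precision, bytes_per_param in [("fp32", 4), ("fp16", 2), ("int8", 1), ("int4", 0.5)]:
    gigabytes = PARAMS * bytes_per_param / 1e9
    print(f"{precision}: {gigabytes:,.0f} GB")

# fp32: 880 GB
# fp16: 440 GB
# int8: 220 GB
# int4: 110 GB
```

Even at full fp32 precision the weights fit on a single 1 TB drive, and at the fp16 precision typically used for inference it's well under half that.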


My understanding is that Google search is a lot more than just PageRank (MapReduce, for example). They had lots of heuristics, data, and machine learning before anyone else, etc.

Whereas the underlying algorithms behind all these GPTs are broadly the same so far. Yes, OpenAI probably does have better data, model finetuning, and other engineering techniques now, but I don't feel any of it is special enough to differentiate them from competitors in the long run.

(If the data collected from a current LLM user in improving model proves very valuable, that's different. I personally think that's not the case now but who knows).


Google's moat in search has always been systems and data center infrastructure. You can create your own search ranking algorithm, but you can't crawl the web and serve search results to billions of worldwide users in a few milliseconds.


I think it's also more than just systems and data centers. It is also difficult to scrape the web the way Google does without using Google IP addresses. A lot of the web now will block you or severely throttle you if you aren't one of the well-known engines that they want indexing them.


> You can create your own search ranking algorithm, but you can't crawl the web and serve search results to billions of worldwide users in a few milliseconds.

rephrasing this for LLMs instead of search: "you can create your own model architecture/training method, but you can't crawl the web and serve language query results to billions of worldwide users in a few milliseconds."

that checks out, right? Google/search == """Open"""AI/LLMs still seems like a decent metaphor to me.


> They just have the best one at this particular time

That is the moat. For developer platforms, it's all about building mindshare and adoption. The more people who know how to use OpenAI, the stronger OpenAI's position in the market. It doesn't matter if there are equivalent or slightly better models unless OpenAI starts to fall significantly behind (and they're currently well in the lead).


I agree, and what you say isn't incompatible with what I said. But the point of the OP is "why even bother using other models/open source models when OpenAI is cheaper?" Well, take away the competition and see what happens.


The difference between OpenAI and the next best model seems to be increasing, not decreasing. Maybe Google's Gemini could be competitive, but I don't believe open source will ever match OpenAI's capability.

Also, OpenAI gets a significant discount on compute due to favourable deals from Nvidia and Microsoft. And they could design their servers better for their homogeneous needs. They are already working on an AI chip.


Being ahead in a race doesn’t mean you’re going to win. Open source models will win eventually because they have the lowest marginal cost to run.

People will figure out what OpenAI is doing and duplicate it. There are many people working at OpenAI; it's going to leak out.


Did you even read my comment? I specifically highlighted why OpenAI might be cheaper in the long run. One reason is that they are already working on a chip that would be optimised just for running a single model.


They are not going to beat Nvidia. Making a chip for one model is not really a good idea: there are more efficiency gains to be made by improving the model and using a general-purpose AI chip than by keeping the model architecture static and building a special-purpose chip for it. Regardless, whatever OpenAI can do, Nvidia can do better, and on more recent process nodes, because they have the volume.


No, because Nvidia's chips have to work for all models. Nvidia has other constraints it needs to satisfy for general users, like instruction sets, security, etc., which OpenAI doesn't have.

e.g. Since they have a fixed model which they know will get billions of requests, they could even use an analogue chip, which is significantly cheaper and faster for inference. The chip in [1] could achieve 10-100x the flops/watt of Nvidia for fixed models, even in its first generation.

[1]: https://www.nature.com/articles/s41586-023-06337-5


I agree with this.

However, what will be interesting is if the price of delivering ChatGPT-style experiences drops as the industry matures/advances, and their pricing moat erodes.

Unlike Uber, where prices are dictated by factors unlikely to move significantly (labour, vehicles, fuel, etc.), the LLM space doesn't have these types of overheads.


The real problem for the OP is not prices going up but OpenAI making something like the AI equivalent of Excel (an AI Swiss army knife), such that middleman companies are not needed for chatbots or talk-to-your-pdf type apps.



