This is an underrated take. Free markets don't work in information war zones.
The premise of free-market ideology is that competition among rational actors with perfect information produces efficient, high-value trades.
AI-based information warfare between buyers and sellers is going to have a Darwinian effect on marketplaces, selecting for sellers whose primary skill is adversarial, highly targeted pricing tactics.
This is Delta's choice, though. For the "average" flight there are still quite a few other airlines, and fliers still reserve the right to vote with their wallets if they perceive better value elsewhere, meaning Delta may have to adjust to that (depending on how much risk they're willing to take and how accurate their personalized pricing models are).
I think they are talking about the technical definition of free markets and the mathematical result that such markets produce efficient pricing under certain conditions (one of which is perfect pricing information).
We aren’t in an ideal capitalist system in the theoretical sense, of course. It is just a model. But it is maybe worthwhile to keep tabs on how far we’ve diverged from it…
Free markets might be in trouble. Capitalism might survive just fine, right? What is capitalism anyway (in a technical sense, as distinct from free markets)? I think it is something about capital owning the means of production, or something like that.
It's silly, but in the LLM world, "open source" is usually used to mean "the weights are published". This is not to be confused with the software-licensing meaning of "open source".
Very strange. I find reasoning has very narrow usefulness for me. It's great for getting a project into context or getting oriented in a conversation, but in long conversations I find reasoning starts to add way too much extraneous stuff and gets distracted from the task at hand.
I think my coding model ranking is something like Claude Code > Claude 4 raw > Gemini > big gap > o4-mini > o3
Time travel debugging on embedded ARM has been available for over 20 years via trace probes [1].
TimeMachine, the product that gave the "time-travel debugging" category its name (in contrast to other attempted names such as reversible, bidirectional, record-replay, etc.), was available in 2003 and supports/supported the ARM7 [2]. Note that this is not the ARMv7 architecture; it's the ARM7 chip [3], in use from 1993-2001.
From what I know, the ARM7 was one of the first ARM designs to implement the Embedded Trace Macrocell (ETM), which could output the instruction and data trace used for trace-probe-based time-travel debugging.
What's limiting us is that Undo does need a Linux kernel - so traditional embedded programming wouldn't be a fit. Embedded Linux could work and we do support ARM64.
I've thought a bit about how you might support time travel on bare-metal embedded - but there are already hardware-assisted solutions out there (Lauterbach's Trace32 was one we came across).
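In case it helps anyone picture what "time travel" means in practice, here's a deliberately toy sketch of the record-replay idea in Python. It's not how Undo, Trace32, or ETM-based probes actually work (those capture machine state via kernel support or trace hardware), but the shape of the idea is the same: record state as you execute, then step backwards by consulting the log.

    # Toy record-replay ("time travel") illustration, not a real debugger.
    import copy

    class Recorder:
        def __init__(self):
            self.snapshots = []              # program state after each executed step

        def step(self, state, mutate):
            mutate(state)                    # execute one "instruction"
            self.snapshots.append(copy.deepcopy(state))

        def rewind(self, steps_back):
            # "Reverse-step": return the state as it was N steps ago.
            return copy.deepcopy(self.snapshots[-1 - steps_back])

    rec = Recorder()
    state = {"x": 0}
    for _ in range(5):
        rec.step(state, lambda s: s.update(x=s["x"] + 1))

    print(state["x"])            # 5, the live state
    print(rec.rewind(2)["x"])    # 3, the state two steps earlier

Real implementations obviously can't snapshot everything on every instruction, which is why they lean on copy-on-write snapshots, deterministic replay, or hardware trace like the ETM instead.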
Great... we've been complaining about useless AI being forced upon us in the commercial sector and now YC alums are going to implement it in veteran healthcare through a backdoor?
I've been pretty neutral about the whole thing to date, but it's starting to seem like YC's AI blindness is actually explained by true evil capitalist intentions.
They aren’t though. $5M is the cost of a single training run. $500B includes the cost of operations, data centers, a lot more failed runs because they weren’t sure they were on the right path, etc.
Right. It's like building a large model rocket and saying you've cracked rocketry for a fraction of the cost that was required in the 1940s and 1950s.
Well yes, yes you did, because all you had to do was follow the existing instructions and guidelines and use easily available materials. You didn't go down any dead ends, and you didn't have to work your way up from propellants like high-test peroxide through dangerous hypergolics before eventually developing solid rocket boosters.
It's like making the generic of a drug someone else developed.
I don’t think anyone is claiming that DeepSeek didn’t produce a very impressive frontier model. They’re just saying it’s not surprising that, in your analogy, flipping the single bit was cheaper than the prior work.
No, just that what DeepSeek did is not as valuable as what the first movers did, because it did not advance the state of the art nearly as much. It's a new, cheaper way to go from Base LM -> CoT "reasoning" LM. We already had CoT "reasoning" LMs, so while the new, cheaper path to get to them is interesting, it's not necessarily groundbreaking either. Also, R1 only works with the "cold start" data that they distilled from o1, so it's not quite clear that it'll ever be able to exceed o1's capabilities. We already know it's much cheaper to distill new, smaller models from large, already pretrained, well-performing models; in fact, $5M sounds like a very expensive way to do so. So while these new techniques are probably going to have some impact, OpenAI is far from quaking in their boots.
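For context on the distillation point: the textbook recipe is just training the small model to match the big model's output distribution. Here's a hedged, generic sketch of that loss (classic Hinton-style soft-label distillation, not DeepSeek's actual "cold start" pipeline; the function name and hyperparameters are made up for illustration):

    # Generic knowledge-distillation loss, illustrative only.
    import torch.nn.functional as F

    def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
        # Soft targets: push the student toward the teacher's softened distribution.
        soft = F.kl_div(
            F.log_softmax(student_logits / T, dim=-1),
            F.softmax(teacher_logits / T, dim=-1),
            reduction="batchmean",
        ) * (T * T)
        # Hard targets: ordinary cross-entropy against the ground-truth tokens.
        hard = F.cross_entropy(student_logits, labels)
        return alpha * soft + (1 - alpha) * hard

The point being: once a strong teacher already exists, that step is comparatively cheap, which is why a low headline training cost isn't shocking on its own.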
Um it’s very valuable, maybe even more valuable… Companies can now have a private LLM with o1 quality without having to send data to OpenAI or pay for their API.
Probably just that it's not as impressive as it appears because it didn't innovate. Which is of course irrelevant, since the innovative leap here was the set of optimizations that let them make their model with an order of magnitude fewer materials, regardless of whatever innovation costs OpenAI ate.
> The next big thing was Cursor. I must admit that I never personally fell in love with it, but given how many people I respect love it, I think that’s a me-problem
I've met so many engineers who have said exactly this. There's clearly a group of people obsessed with Cursor, but it's interesting to me how alien they seem to the majority of people using AI codegen right now.
I had to uninstall it because it had associated itself with every possible file extension. I couldn't open a file without Cursor popping up. Very horrifying for that to happen to my computer when working on important projects.
On Linux, I have the opposite issue. I ended up hard symlinking cursor to VS Code because Cursor wasn't opening despite being set as the default editor.
I prefer vscode+copilot. It's much cheaper and has all the functionality I want. There's access to 3.5 sonnet, and it can edit/create up to 10 files at a time.
I've pretty much moved full time to Cursor from VS Code. I don't trust it for big code blocks, but Ctrl-K plus a reasonable command (one I could have typed up myself) is saving me quite a bit of time.