"...37%.... That means nearly one in four..."
Eh, no it doesn't, it means nearly two in five. Which is worse.
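The arithmetic is easy to check with a throwaway snippet: 37% sits far closer to 2/5 (0.40) than to 1/4 (0.25). A minimal sketch, with the candidate fractions picked just for illustration:

```python
# Which common fraction is 37% closest to?
fractions = {"one in four": 1 / 4, "one in three": 1 / 3, "two in five": 2 / 5}
value = 0.37

# Pick the fraction with the smallest absolute distance from 0.37.
closest = min(fractions, key=lambda name: abs(fractions[name] - value))
print(closest)  # -> two in five
```

So "nearly one in four" undersells the figure by a wide margin.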
"...37%.... That means nearly one in four..."
Eh, no it doesn't, it means nearly two in five. Which is worse.
Jason Kint writes a thread on how Google spun - and publications printed their spin - on a recently lost case: https://xcancel.com/jason_kint/status/1836781623137681746
If you are already very cynical about tech journalism (or the state of journalism in general), this might be nothing new, except for confirmation straight from Google's internal documents. But it is always nice to see how the sausage is made.
Isn't this just Snow Crash again? Can't these techbros read another book? We already have the Metaverse, and it wasn't that popular in reality.
That's the one, thank you.
German pilot, and a crash in France, not a French pilot. And the second pilot locking out the captain, not the other way around. Otherwise my memory seems to have served.
This was so stupid.
A hijacking happens when passengers overflow into the cockpit from the cabin.
Oh no! A little kid has been invited to have a look! Passenger overflow! Hijacking!
His attempted solution isn't as cringeworthy, if one overlooks the reasoning. Separating the cabin from the pilots is a way of preventing hijackings that has been attempted, but it has problems: notably if the pilots have an acute medical emergency, or indeed if a pilot steers the plane into the ground.
Some ten years ago a French pilot locked out his second and flew the plane into the ground. For increased safety after 9/11, the cockpit door could only be opened from the inside.
The picture at the top of the blog post is Sam Altman, the guy saving the world from the AI he is creating.
And yeah, he looks like that.
Tired of picking out gifts the recipient doesn't need or want? Here at Giftr we have partnered with Amazon to bring you Shit-as-a-Service!
Just prompt our chat-bot and we will send a gift to a statistically similar address!
If it never arrives? Well, they didn't want it anyway! Giftr!
I think it is a combination of:
Ok, point on language.
But I thought LLMs were machine learning, or rather a particular application of it? Have I misunderstood that? Isn't it all black-boxed matrices of statistical connections?
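That "black-boxed matrices" picture can be made concrete with a toy sketch. Everything below is invented for illustration (a hypothetical three-token vocabulary, hand-written weights); real LLMs stack many such layers with attention and nonlinearities, but the core operation really is matrix arithmetic over learned weights:

```python
# Toy "next token" predictor: one weight matrix of statistical connections.
vocab = ["the", "cat", "sat"]
W = [  # W[i][j] ~ learned score that token j follows token i (made-up numbers)
    [0.1, 0.8, 0.1],  # after "the": probably "cat"
    [0.1, 0.1, 0.8],  # after "cat": probably "sat"
    [0.6, 0.2, 0.2],  # after "sat": probably "the"
]

current = vocab.index("cat")
scores = W[current]                          # one row of the matrix
prediction = vocab[scores.index(max(scores))]  # pick the highest score
print(prediction)  # -> sat
```

A real model learns those numbers from data instead of having them written by hand, which is exactly what makes the result a black box.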
I have so far seen two working AI applications that actually make sense, both in a hospital setting:
These two are nifty, but they don't make a multi-billion-dollar industry.
In other words, the bubble is bursting and the value-to-waste ratio looks extremely low.
Say what you want about the Tulip bubble, but at least tulips are pretty.
Here it sounds like he is criticising the parliamentary system, where the legislature elects the executive, as opposed to direct election of the executive. Of course, both parliamentary and presidential (and combined) systems use a number of different voting systems. The US famously does not use FPTP for presidential elections, but instead uses an electoral college.
So, to be very charitable, he means a parliamentary system where it's hard to depose the executive. I don't think any parliamentary system requires 60% (presumably of votes or seats in parliament) to depose a cabinet leader, mostly because once you have 50% aligned against the cabinet leader you presumably have an opposition leader with a potential majority. So 60% is stupid.
If you want a combined system where parliament appoints but can't depose, Suriname is the place to be. Though of course they appoint their president for a term, not indefinitely. Because that's stupid.
To sum up: stupid ideas, expressed unclearly. Maybe he should have gone to high school.
Repeat until a machine that can create God is built. Then it's God's problem.
But it must be a US God, otherwise China wins.