MartianSands

joined 2 years ago
[–] MartianSands 0 points 1 week ago

That's not obviously the case. I don't think anyone has a sufficient understanding of general AI, or of consciousness, to say with any confidence what is or is not relevant.

We can agree that LLMs are not going to be turned into general AI though

[–] MartianSands 2 points 2 weeks ago (1 children)

You're still putting words in my mouth.

I never said they weren't stealing the data

I didn't comment on that at all, because it's not relevant to the point I was actually making: that treating the output of an LLM as if it were derived from any factual source is really problematic, because it isn't.

[–] MartianSands 1 points 2 weeks ago (3 children)

You're putting words in my mouth, and inventing arguments I never made.

I didn't say anything about whether the training data is stolen or not. I also didn't say a single word about intelligence, or originality.

I haven't been tricked into using one piece of language over another, I'm a software engineer and know enough about how these systems actually work to reach my own conclusions.

There is not a database tucked away in the LLM anywhere which you could search through to find the phrases it was trained on; it simply doesn't exist.

That isn't to say it's completely impossible for an LLM to spit out something which formed part of the training data, but it's pretty rare. The vast majority of what it generates doesn't come from anywhere in particular, and you wouldn't find it in any of the sources which were fed to the model in training.

[–] MartianSands 10 points 2 weeks ago (5 children)

That simply isn't true. There's nothing in common between an LLM and a search engine, except insofar as the people developing the LLM had access to search engines, and may have used them during their data gathering efforts for training data

[–] MartianSands 29 points 2 weeks ago (10 children)

Except these AI systems aren't search engines, and people treating them like they are is really dangerous

[–] MartianSands 9 points 2 weeks ago (1 children)

It's probably not just software. I'd expect it to be an entirely separate set of processors which track how the flight is going and do absolutely nothing except decide whether or not to terminate

[–] MartianSands 8 points 1 month ago

I couldn't find the actual pinout for the 8 pin package, but the block diagrams make me think they're power, ground, and 6 general purpose pins which can all be GPIO. Other functions, like ADC, SPI and I2C (all of which it has) will be secondary or tertiary functions on those same pins, selected in software.

So the actual answer you're looking for is basically that all of the pins are everything, and the pinout is almost entirely software defined
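Software-defined pinmuxing like this usually works by packing a small function-select field per pin into a configuration register. A minimal sketch of the idea (the register layout, field width, and function codes here are hypothetical, not from any specific datasheet):

```python
# Hypothetical pinmux: each pin gets a 3-bit function-select field packed
# into one 32-bit register (pin 0 -> bits 0..2, pin 1 -> bits 3..5, ...).
FUNC_GPIO, FUNC_ADC, FUNC_SPI, FUNC_I2C = 0, 1, 2, 3

def set_pin_function(reg: int, pin: int, func: int) -> int:
    """Return the register value with `pin` switched to `func`."""
    shift = pin * 3
    reg &= ~(0b111 << shift)         # clear the pin's existing field
    reg |= (func & 0b111) << shift   # write the new function code
    return reg

reg = 0  # all pins default to GPIO
reg = set_pin_function(reg, 2, FUNC_ADC)  # pin 2 becomes an ADC input
reg = set_pin_function(reg, 4, FUNC_SPI)  # pin 4 joins the SPI peripheral
```

On real silicon the write would go to a memory-mapped register rather than a Python variable, but the shift-mask-or pattern is the same.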

[–] MartianSands 9 points 1 month ago (1 children)

BGA, like in the photo, isn't the only option. There are options only slightly larger with hand-solderable packages (if you're good at soldering)

[–] MartianSands 1 points 1 month ago

"shortest route" and "straight line" actually mean pretty much the same thing. The shortest route is the straight line. Sorry if I confused the matter by switching up the terminology.

Flying parallel to the lines of latitude would mean that your bearing doesn't change much, sure, but flying in a straight line would require your heading to change continuously.

The aircraft in the screenshot was flying a very not-straight course

[–] MartianSands 2 points 1 month ago (2 children)

That's misleading. The shortest route would be the "great circle" joining the two points, which lines of latitude definitely are not.

The only line of latitude which is a great circle is the equator.

[–] MartianSands 15 points 1 month ago (1 children)

No, it's not. It's north of the equator, so the straight-line route would look like a curve towards the north. This route is curved south, which means it's actually because of air traffic control routing them along approved flight paths. That might be for traffic management reasons, or because of terrain on the route, or restricted airspace.

[–] MartianSands 4 points 1 month ago

Even that won't be truly effective. It's all marketing, at this point.

The problem of hallucination really is fundamental to the technology. If there's a way to prevent it, it won't be as simple as training it differently
