this post was submitted on 23 Nov 2023
Bill Gates says a 3-day work week where 'machines can make all the food and stuff' isn't a bad idea::"A society where you only have to work three days a week, that's probably OK," Bill Gates said.

[–] [email protected] 9 points 1 year ago (1 children)

I think the idea would be to have machines replace people wherever possible and then have multiple people split the work time where it isn't. Why does one farmer have to work 24/7 if two could split the work and actually have a life outside of work?

[–] [email protected] 4 points 1 year ago (2 children)

I think ultimately this is going to become the crunch point. Because what kinds of jobs could AI (with appropriate robotics) eventually take over in the mid-term future?

  • Driving (if all cars were computer-controlled today and roads were segregated from pedestrians, it'd probably already be possible)
  • End-to-end delivery could likely be automated. Large parts of the process already are
  • Train (and bus, based on item 1) drivers. Currently, many urban transit systems around the world run on ATO (automatic train operation), where the train operator opens/closes the doors and starts the train and is primarily present for safety. The rest is done automatically. There are already fully automated transit systems, and I suspect it's unions and legitimate safety concerns stopping full automation elsewhere. But it could be done with some work, I think.
  • Software development. I mean, the AI prediction in Visual Studio is currently sometimes scarily good. It DOES need to be guided by someone who can recognise when it gets things wrong. But so often, developing a function now means writing 2 lines and auto-completing half of the rest from the "AI". It's really a matter of improving the LLMs and tying them into product-specific knowledge. Our days are most certainly numbered, I think.
  • Software design. This is similar to the above. With a good LLM (or general AI) loaded with good product knowledge, you might only need a few people to maintain/rework requirements into a format it can work with and feed back mistakes until they get a sensible result, each time reducing the likelihood that the mistake will happen again. We'll need fewer people, for sure.
  • I think a lot of the more basic functions of a nurse might well become tasks for some form of robotic AI companion to fully trained nurses/doctors. Maybe this is a bit further away.
  • Airline pilots could probably already be replaced, and it's purely on safety grounds that I'm glad they're not. Generally, on a flight that goes well, once the route is programmed the pilots will taxi the plane to the runway, the plane will automatically set thrust for an economical take-off, and once established in the air the autopilot will pretty much take them to their destination. The pilots can then switch modes, and at a suitably equipped airport the autopilot can take the plane to a safe landing (although in practice, pilots usually take back control around 500 feet from the ground, I think). There aren't really many steps left to automate. I feel like at least one pilot will be retained for safety reasons, and given certain high-profile incidents there's an argument to keep 2 forever. But in terms of could they be replaced? Yes, totally.
  • Salespersons. Honestly, given the way algorithms trick people into buying things they don't need, I'd argue they've already been replaced and businesses just still employ real salespeople because they feel they need to :P
  • Cleaners (domestic and street/commercial) could potentially be replaced by robotic versions. At the very least, the number of real people needed could be drastically reduced to supervisors of robotic teams.
  • Retail workers. There's already the automated McDonald's, isn't there? I also think the fact that commercial property in large cities is becoming less occupied is a sign that, as a whole, we're moving away from high-street retail towards online or specialist shopping. So while we'll probably always need some real people here, the numbers will be much lower.

Now, when it comes to industrial and farm work, there's a LOT that is already semi-automated. One person with the right tech can now do a job that might once have taken 10 or more people. I can see this improving, and if we ever pull off a more generalised AI approach, more entire roles could be eliminated.

My main point is that we're already at the point where the number of jobs that need people is considerably lower than it used to be, and this trend will continue. We know we cannot trust the free market and business in general to be ethical about this. So we should expect a large surplus of people with no real chance of gainful employment.

How we deal with that is important. Do we keep capitalism and go with a UBI, letting people pursue their passions to top it up? Do we have some kind of inverse lottery for the jobs that do need doing, where people perhaps take a 3-month block of 3-day working weeks to fill some of the positions that are needed? I'm not sure. I suspect we're going to go through at least a short "dark age" where the rich get MUCH richer and everyone else gets screwed over before something is done about the problem.

Looks to me like Gates is looking ahead to this.

Sorry if that wall of text sounds pessimistic. Just one way I can see things going.

[–] [email protected] 2 points 1 year ago* (last edited 1 year ago) (1 children)

Honestly, 10 to 1 is a low estimate. With the right equipment it's an absurd number, like 100 or even 200 to 1 compared with what it once was.

[–] [email protected] 2 points 1 year ago

I think it varies by industry/job position. It was a number out of thin air though, I'll admit.

[–] [email protected] 2 points 1 year ago (1 children)

We know we cannot trust the free market and business in general to be ethical about this.

I disagree with that.

I say you can trust the markets and businesses to always act as unethically as possible. And by 'possible' I mean a lot worse than legally possible.

[–] [email protected] 0 points 1 year ago (1 children)

I don't really see organisations as unethical. They usually don't act ethically, but that's not because they're unethical as a whole.

I see them more like insects. They generally react to stimuli and just do the same as the other insects/organisations, things that have been proven to work. They're also generally driven by one basic instinct: make more money, and they do it at any cost. The drones (employees) are entirely disposable in this endeavour, and if they can remove them from the equation entirely, they will do it in a heartbeat.

Even those that perhaps do have some form of ethical streak and don't think they should dump all their employees for AI/robots? Well, good for them, but they'll be driven out of business by those that do.

When you think of a business or other organisation in this way, a lot of the weird things they do start to make a lot of sense.

[–] [email protected] 1 points 1 year ago (1 children)

make more money, and they do it at any cost.

That doesn't seem unethical to you??

'At any cost' usually means: by disregarding all kinds of laws, and all kinds of ethics as well.

[–] [email protected] 0 points 1 year ago (1 children)

My point is, you don't see insects as ethical or unethical, and I see organisations the same way. They're acting on instinct, just aiming to do what they exist for: make money. Ethics don't even come into it. Now, peering in from the outside, you can try to apply society's ethical views to organisations. But they generally don't even consider them (until they're forced to by local legislation, or until the route to making more money, or indeed not losing money, is to be seen to be ethical).

This is why there's more often than not a certain kind of person drawn to leadership positions.

[–] [email protected] 1 points 1 year ago (1 children)

You are saying that organisations don't need any ethics at all, but at the same time you refuse to call that "unethical".

For me, this is the point where I call it EOD.

[–] [email protected] 0 points 1 year ago

Nope. I think you're not really understanding what I'm trying to say. I'm saying that ethics don't factor into an organisation's decisions, in the same way they don't for a colony of insects. They are ethically neutral in that respect.

At the same time, if you apply ethics looking from the outside in, of course you can cast their actions as ethical or unethical, and many of their actions will be unethical.

I'm actually saying this is a bad thing, but it's just a property of how organisations, and especially successful businesses, operate. We're not going to change that, I suspect. As such, we should expect businesses to exploit AI to the fullest extent, even knowing that removing most or all of their employees is bad for the employees, bad for the country (and the world), bad for the economy, and ultimately, in the future, bad for the business/organisation too. But they simply do not look that far ahead.