this post was submitted on 28 Jun 2023
7 points (100.0% liked)
Socialism
3034 readers
1 user here now
Beehaw's community for socialists, communists, anarchists, and non-authoritarian leftists (this means anti-capitalists) of all stripes. A place for all leftist and labor news and discussion, as long as you're nice about it.
Non-socialists are welcome to come to learn, though it's hard to get to in-depth discussions if the community is constantly fighting over the basics. We ask that non-socialists please be respectful and try not to turn this into a "left vs right" debate forum by asking leading questions or by trying to draw others into a fight.
This community's icon was made by Aaron Schneider, under the CC-BY-NC-SA 4.0 license.
founded 2 years ago
First, thanks for modding this hive, @OneRedFox; second, thanks for posting this.
Given how quickly technology moves and the myriad reasons new technology is pushed and adopted, it’s all too easy to overlook immoral and unethical behavior. This article touches on many things people should be wary of regarding the use (and blind trust) of AI technology. Though it’s only anecdotal, it unfortunately seems like most people with the influence to push meaningful discussion of subjects like this in the larger social arenas are too preoccupied with how “shiny” AI technology is.
That being said, I’m going on record now and stating that I’ll be fighting for the resistance against Skynet. 🍁
Yeah, computers are not flawless, and AI is no exception. It's also subject to the usual biases that can be introduced into a system, either directly through the programming or through the datasets used for training. I recall that a few years ago Google's image recognition technology was mistakenly identifying Black people as gorillas because they didn't test their tech on enough racial minorities. Computers and AI can be useful tools, but people need to keep those limitations in mind when using them.
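As a minimal sketch of how that second failure mode works (a skewed training set quietly propagating into a model's behavior), here's a toy example. All of the data, groups, and thresholds below are made up purely for illustration; the "model" is just a nearest-centroid score, not any real system:

```python
# Toy illustration: a model fit to an unrepresentative dataset
# inherits that dataset's skew. Hypothetical data throughout.
import random

random.seed(0)

def sample(group, n):
    """Draw n 1-D feature values for a group; the two groups
    have different (but equally valid) feature distributions."""
    center = 0.0 if group == "A" else 3.0
    return [center + random.uniform(-1, 1) for _ in range(n)]

# Group A dominates the training data; group B is barely present.
train = sample("A", 100) + sample("B", 2)

# "Training": learn a single centroid of what 'normal' looks like.
# Because group A dominates, the centroid is dragged toward A.
centroid = sum(train) / len(train)

# "Deployment": score unseen group-B samples by distance from the
# learned centroid and flag anything far away as anomalous.
test_b = sample("B", 50)
flagged = sum(1 for x in test_b if abs(x - centroid) > 1.5)
print(f"Group B samples flagged as anomalous: {flagged}/50")
```

Nothing in the scoring rule mentions group membership, yet nearly every group-B sample gets flagged, simply because the training data under-represented that group. Real systems fail the same way at much higher dimension, which is why auditing training-set composition matters as much as auditing the code.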