The idea of NNs, or the basis itself, is not AI. If you had actually read D. E. Rumelhart, G. E. Hinton, and R. J. Williams, "Learning Internal Representations by Error Propagation," Sep. 1985, then you would understand this, because that paper is about a machine learning technique, not AI. If you had done your research properly instead of just reading Wikipedia, you would also have come across autoassociative memory, the precursor to autoencoders and generative autoencoders, which are the foundation of a lot of what we now think of as AI models. H. Abdi, "A Generalized Approach for Connectionist Auto-Associative Memories: Interpretation, Implication, Illustration for Face Processing," in J. Demongeot (Ed.), Artificial Intelligence and Cognitive Sciences, Manchester University Press, 1988, pp. 151–164.
Over-enthusiastic English teachers... and Skynet (cue dramatic music)
Hey, don't go giving Big Brother any good ideas now (especially ones that would work on me)
Not the specific models, unless I've been missing out on some key papers. The 90s models were a lot smaller; a "deep" NN used to be 3 or more layers, and that's nothing today. Data is a huge component too.
So I'm a researcher in this field, and you're not wrong, there is a load of hype. The area that's been getting the most attention lately is specifically generative machine learning techniques. The techniques are not exactly new (some date back to the 80s/90s), and they aren't actually that good at learning. By that I mean they need a lot of data and computation time to get good results, two things that have gotten easier to access recently. However, such a complex system isn't always a requirement. Even ELIZA, a chatbot made back in 1966, produces responses surprisingly similar to those of some therapy chatbots today without using any machine learning. You should try it and see for yourself; I've seen people fooled by it, and the code is really simple. People also think things like Kalman filters are "smart," but they're just straightforward math, so I guess the conclusion is that people have biased opinions.
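To show how simple the ELIZA trick is, here's a minimal sketch of the idea: ranked regex rules plus pronoun "reflection," no machine learning anywhere. The rules and wording below are illustrative, not Weizenbaum's original script.

```python
import re

# Pronoun reflection so echoed fragments read naturally
# ("my job" -> "your job"). Table is illustrative, not exhaustive.
REFLECTIONS = {"i": "you", "me": "you", "my": "your",
               "am": "are", "you": "I", "your": "my"}

# Rules are tried in order; the catch-all at the end guarantees a reply.
RULES = [
    (re.compile(r"i need (.*)", re.I), "Why do you need {0}?"),
    (re.compile(r"i am (.*)", re.I), "How long have you been {0}?"),
    (re.compile(r"i feel (.*)", re.I), "Why do you feel {0}?"),
    (re.compile(r"(.*)", re.I), "Please tell me more."),
]

def reflect(text):
    return " ".join(REFLECTIONS.get(w, w) for w in text.lower().split())

def respond(user_input):
    for pattern, template in RULES:
        m = pattern.match(user_input.strip())
        if m:
            return template.format(*(reflect(g) for g in m.groups()))

print(respond("I feel sad about my job"))
# -> Why do you feel sad about your job?
```

A handful of rules like these, tuned for open-ended "therapist" replies, is basically all the original program did.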
That's true. Also, at some point the human will go "that's too much work, I'm not going to answer that," but the AI will always try to give you its best response. Like, I could look up the Unicode characters you're using, but I'd never actually take the time to do that.
All these models are really terrible at following conversations, even ChatGPT; I can only get it to reliably remember about 2 responses back. If I can't get what I want in two, then I need to restate info or edit the past prompts.
Generally: a very short-term memory span, so have longer conversations, as in more messages. Inability to recognize concepts/nonsense. Hardcoded safeguards. An extremely consistent (typically correct) writing style. The use of the Oxford comma always makes me suspicious ;)
I'm kind of against putting tracking chips in anything I might eat
There are more bar soaps than ever. You can even get special hair shampoo and conditioner bars, or shaving cream as a bar.
You mean the plastic ones or the real ones? The plastic ones you can hand wash, but I wouldn't stick them in a washing machine (you probably could if you used a laundry bag and a low-spin setting).
I'm surprised no one has said London Drugs https://www.londondrugs.com/about-london-drugs/about-us.html; that is the closest to Amazon imo. shoppersdrugmart.ca and fortinos.ca are owned by Loblaws and have an expanded marketplace selection.
Costco.ca is an American company but publicly traded. Well.ca used to be Canadian but is now owned by McKesson Corporation, which is American and publicly traded.
You could try to buy through Etsy or look for Canadian Shopify stores like Bee Kind https://beekindwraps.ca/ ( here is a complete list ), which are more specific sites. You can often find the same products on marketplace sites, but the merchant pays fewer fees if you go through their own site, so try searching for products there too.