I know at least one person like that. They won't outright avoid prop planes, and they know it's illogical, but the idea of flying on one still makes them nervous.
I believe Jack has done a bit for his anti-freebooting campaign, but to my knowledge, it hasn't been much, and this is by far the most formal.
From my understanding, they did land at high tide (to allow ships to get closer), but in pictures, it's still quite the distance. I guess because the beach is shallow, they still had to land a distance out?
They had some idea, although it was less certain than it appears in Saving Private Ryan.
First of all, there were efforts to weaken the defenses. Both naval bombardment and bombs dropped from planes were meant to significantly soften the defenses for the landing. According to the plans, this should have significantly reduced the defenses. In reality, the naval bombardment was nowhere near large enough, and the bombers missed their targets due to bad weather. This was only discovered as they reached the beach.
Once the infantry was landing, there was also supposed to be quite a bit more support for them. Specialized amphibious tanks were created and meant to be driven up onto the beaches to provide cover for the infantry. This almost immediately went awry as the rough water swamped or sank dozens of the initial tanks, which led to the use of landing craft for the remaining tanks, slowing down their deployment. Even of those that were launched by landing craft, many were lost.
Also worth noting is that the Normandy beaches don't actually look that much like the movie. When they land in Saving Private Ryan, it looks like they're only 20 meters or so from the cliff. In actuality, it was much, much further. If you look at the famous photo from the landing, Into the Jaws of Death by Robert F. Sargent, it gives you an idea of what the beach actually looked like and the conditions that morning - visibility was low, and they were likely a hundred meters or more from the cliff face. Less likely to get shot the moment the gates open than how it looks in the movie, although horrifically worse in nearly every other way.
If you're really interested in more detail, TimeGhost has an excellent documentary (split into pieces to make it watchable as a series) on the subject on their D-Day 24hrs channel that covers the background, the events of the day, and the details and context surrounding it in extreme detail. That said, it's a multi-day watch, given that it's 24 hours long.
I'm assuming you have a store key.
You could post to one of the relevant communities for game giveaways, such as !freegames@feddit.uk or !Randomactsofgaming@lemm.ee
Standard practice is to pretty much just say what game you're offering, and then DM the key to a random respondent after a day or two.
The only one that comes to mind is Club Penguin's coffee shop. That one is burned into my memory.
Is it just me being set in my ways, or does this look terrible? It seems like it's going to make it harder to use URLs and clutter up what was previously clean, functional UI just to highlight rarely-used commands.
Edit: Also, isn't hiding the URL a security issue? How else do you recognize phishing sites?
My point of contention is that the arguments you're using are flawed, not your intentions. OpenAI, Meta, Disney, etc. are in the wrong because they pirate/freeboot and infringe on independent artists' licenses. It's not their use of technology or the derivative nature of the works it produces that are the problem: making AI the face of the issue moves the blame away from the companies, and allows them to continue to pirate/freeboot/plagiarize (or steal, as you define it) from artists.
Yes, part of my point is that capitalism is bad, but that's further up the chain than what I was arguing. My point is that copyright law, and more importantly its implementation and enforcement, is broken. Basically all your issues originate not with AI but with the fact that independent artists have no recourse when their copyrights are violated. AI wouldn't be an issue if AI companies actually paid artists for their work, and artists could sue companies who infringe on their rights. The problem is that artists are being exploited and have no recourse.
Using an allegory to hopefully make my point a bit more clear: Imagine you have a shop of weavers (artists). The company running the shop brings in a loom (AI), and starts chaining their workers to it and claiming it's an Automatic Weaver™ (pirating and violating artists' rights). The problem isn't the loom, and blaming it shifts blame away from whoever it was that decided to enslave their workers. Trying to ban the loom doesn't prevent the shop from just chaining the workers to their desks, as was often done in the past, nor does it prevent them from bringing in Automatic Potters™. If you want to stop this, even ignoring the larger spectre of capitalism, it should be slavery that is outlawed (already done) and punished (not done), not the use of looms.
If you are trying to fix/stop the current state of AI and prevent artists from being exploited by massive companies in this way, banning AI will only slow it and will limit potentially useful technology (that artists should be paid for). Rather than tackle one of the end results of the problem, you need to target it closer to its root - the fact that large companies can freely pirate, freeboot, and plagiarize smaller artists.
It isn't current AI voice tech that was the issue. It was the potential for future AI they were worried about. AI voices, as they are now, are of similar quality to pulling someone off the street and putting them in front of a mid-range mic. If you care about quality at all (without massive changes to how AI tech functions), you'll always need a human.
And to be clear, what about AI makes it the problem, rather than copyright? If I can use a voice synthesizer to replicate an actor's voice, why is that fine and AI not? Should it not be that reproduction of an actor's voice is right or wrong based on why it's done and its implications, rather than because of the technology used to replicate it?
Edit: And to be clear, just because a company can use it as an excuse to lower wages doesn't mean it's a viable alternative to hiring workers. Claims that they could replace their workers with AI are just the usual capitalist bullshit excuses to exploit their workers.
Big movie studios will use it to generate parts (and eventually all) of a movie. They can use this as leverage to pay the artists less and hire fewer of them. Animators, actors, voice actors.
Only if it's profitable, and given that AI output is inherently very limited, it won't be. AI can only produce lower-quality, derivative works. In isolation, some works might not be easy to distinguish, but that's only on a small scale and in isolation.
If a movie studio pirated work and used it in a film, that's against copyright and we could sue them under current law.
But if they are paying OpenAI for a service, and it uses copyrighted material, then since OpenAI did the stealing and not the studio, it's not clear if we can sue the studio.
You can sue the studio. In the same way, you would sue the studio if an artist working there (or even someone directing artists) creates something that violates copyright, even by accident. If they publish a work that infringes on copyright, you can sue them.
Seems like it's being argued that because of the layer of abstraction that is created when large quantities of media are used, rather than an individual's work, it's suddenly a victimless crime.
By that logic, anyone who takes inspiration, no matter how broad, or uses another's work in any way, no matter how transformative, should be prevented from making their own work. That is my point. AI is just an algorithm that takes thousands of images and blends them together. It isn't evil, any more than a paintbrush is. What is, is piracy for commercial use, and non-transformative copyright infringement. Both of these are already illegal, but artists can't do anything about it, not because companies haven't broken the law, but rather because an independent artist trying to take, for example, Meta to court is going to bankrupt themselves.
Edit: Also notable in companies using/not using AI is the fact that even transformative and """original""" AI work cannot be copyrighted. If Disney makes a movie that's largely AI, we can just share it freely without paying them.
I would say no, so long as it's a case of, "This is what I'm looking for in a partner." rather than, "This is what I expect out of a woman." or, "This is what I think women should do." So long as the woman is knowingly and consensually entering into that arrangement, and you're not trying to push it onto her, there's nothing wrong with it.