mindbleach

joined 2 years ago
[–] mindbleach@sh.itjust.works 2 points 11 hours ago (1 children)

The best song on a damn good album about a very silly thing.

[–] mindbleach@sh.itjust.works 2 points 11 hours ago

Mike comes very close to the real story here. There's people posting one fake trailer per day, and if their output becomes indistinguishable from a real trailer... why not make a whole movie? Yeah yeah yeah, presumably starting with knockoff clickbait trash, written by a robot, starring actors who will sue them into the dirt. But there's no technical separation between that, and making shit up from scratch.

Any film student with a thousand bucks and a first-draft script can edit together an "animatic" that's basically a finished film. It's like their stock-footage library had exactly what they wanted, every time.

Or they can hire actors like normal, to give real performances, and have the robot make up a billion dollars in CGI.

This does not go well for studios.

[–] mindbleach@sh.itjust.works 8 points 15 hours ago (1 children)

'We want to be clear: we're assholes, not fascists.'

[–] mindbleach@sh.itjust.works -1 points 17 hours ago

"... but AI steals!"

[–] mindbleach@sh.itjust.works 4 points 19 hours ago

Engineers don't let engineers design interfaces.

[–] mindbleach@sh.itjust.works 1 point 20 hours ago

Training is transformative use.

From-scratch models with public-domain data would still function.

All commercial works before 1995 should be public domain anyway.

Corporations forcing this tech on everyone is not a problem with the tech.

Any measured take on this whole stupid industry starts to feel like "enlightened centrism." None of it is so morally simple or disconnected that responding with YAY or BOO makes any damn sense. The cost of whiz-bang science-fiction technology is that every science fiction story is about the consequences of that technology. Worrying possibilities are the genre.

And yet: whiz bang. We have programs that can emit video just from describing it - that's fucking awesome. That's how computers work when authors don't know how computers work. We have programs which, just by guessing the next word, are debatably adequate writers, editors, coding partners, translators, et very cetera. They're not perfect at anything you tell them to do... but they'll do anything you tell them to. Vast swaths of "[blank] requires true intelligence" went right out the window.

Legitimate concerns abound, but the loudest complaints include 'the robot took books from the library.' That's what libraries are for. Be mad that these assholes let it spy on you, not that it also read bestselling novels.

Similarly, there is no such thing as "digital borrowing." The concept is fundamentally impossible. It's a copy.

[–] mindbleach@sh.itjust.works 7 points 1 day ago (1 children)

This man, in particular, is genuinely not allowed to acknowledge that. Ze Germans overshot 'don't be antisemitic' so hard that they'll jail Semitic people for pointing out a foreign government is trying to exterminate a Semitic people.

[–] mindbleach@sh.itjust.works 16 points 1 day ago (1 children)

I'm shocked it doesn't happen more often and more blatantly.

[–] mindbleach@sh.itjust.works 13 points 1 day ago (3 children)

You know. You're just not allowed to say it.

There was a tech talk about Quake 3 - surprisingly not by Carmack - which highlighted how id is technologically conservative. Quake 3 was their first game with no software renderer, and it still asked for nothing but OpenGL 1.1 and a Pentium II. It somehow featured curved surfaces, volumetric fog, and believe it or not, shadow volumes. Dynamic lighting was all Blinn-Phong from 1977. Static lighting was one (1) 128x128 lightmap, for dramatic gradients all over a stadium-sized level.
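(For scale: the entire Blinn-Phong term is a few lines. A rough sketch in C, mine and not id's - the vec3 helpers and names are made up.)

```c
/* Rough Blinn-Phong sketch in C99 - illustrative only, not id's code. */
#include <math.h>

typedef struct { float x, y, z; } vec3;

static float dot3(vec3 a, vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

static vec3 normalize3(vec3 v)
{
    float len = sqrtf(dot3(v, v));
    return (vec3){ v.x / len, v.y / len, v.z / len };
}

/* n = surface normal, l = direction to the light, v = direction to the eye,
 * all unit length.  Returns diffuse + specular intensity for one light. */
float blinn_phong(vec3 n, vec3 l, vec3 v, float shininess)
{
    vec3  h        = normalize3((vec3){ l.x + v.x, l.y + v.y, l.z + v.z });
    float diffuse  = fmaxf(dot3(n, l), 0.0f);
    float specular = powf(fmaxf(dot3(n, h), 0.0f), shininess);
    return diffuse + specular;
}
```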

Doom 3 had steeper system requirements because it still did shadow volumes on the CPU and re-rendered most of the scene for each of them. But look at a screenshot with lighting disabled and the polygon count is closer to Quake 2's, with bump maps adding all of the detail.
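(The GPU half of that trick is just stencil bookkeeping. Here's a rough depth-fail sketch against plain GL 1.x calls - my illustration, not id's code; draw_shadow_volume() stands in for the silhouette geometry the CPU extruded.)

```c
/* Depth-fail ("Carmack's reverse") stencil passes, sketched against plain
 * OpenGL 1.x calls.  Assumes the depth buffer was already filled by an
 * ambient pass.  Not id's code; draw_shadow_volume() is a placeholder. */
#include <GL/gl.h>

/* Placeholder: in a real renderer this submits the light's extruded
 * silhouette geometry - the part Doom 3 built on the CPU every frame. */
static void draw_shadow_volume(void) { /* glBegin()/glEnd() or vertex arrays */ }

void stencil_shadow_pass(void)
{
    glEnable(GL_STENCIL_TEST);
    glEnable(GL_CULL_FACE);
    glDepthMask(GL_FALSE);                                /* leave depth alone */
    glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);  /* and color too     */
    glStencilFunc(GL_ALWAYS, 0, ~0u);

    /* Back faces: increment stencil wherever the volume fails the depth test. */
    glCullFace(GL_FRONT);
    glStencilOp(GL_KEEP, GL_INCR, GL_KEEP);
    draw_shadow_volume();

    /* Front faces: decrement wherever the volume fails the depth test. */
    glCullFace(GL_BACK);
    glStencilOp(GL_KEEP, GL_DECR, GL_KEEP);
    draw_shadow_volume();

    /* Stencil == 0 means "outside every volume": only those pixels get the
     * additive lighting pass for this light.  Repeat per light - hence the
     * re-rendering. */
    glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
    glStencilFunc(GL_EQUAL, 0, ~0u);
    glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP);
}
```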

Rage, with its unique texel for every square inch of its gigantic world, could both run on an original iPhone and stream from a DVD on Xbox 360.

So it's really fucking weird to see them demand raytracing. Look: I've been following real-time raytracing on GPUs since 2009, when Ray Tracey on Blogspot coerced it out of his GTX 300-series. It seemed like an obvious choice, once we figured out how to use fewer rays. (Blending with past frames was an ugly kludge; obviously that wouldn't continue.) I had mixed feelings when Nvidia made it yet another proprietary anticompetitive gimmick. I do not understand how modern cards have hardware specifically for this thing - and it still chugs. It just uses more rays. Like your low-frequency indirect lighting needs multiple samples per-pixel, instead of updating some probes.
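(That "blending with past frames" kludge is nothing fancier than a running average. A sketch - the buffer layout and the alpha weight are mine, not any particular engine's.)

```c
/* Exponential moving average over per-pixel ray-traced lighting - the
 * "blend with past frames" trick.  Names and the alpha value are
 * illustrative, not any particular engine's. */
#include <stddef.h>

void temporal_accumulate(float *history, const float *current,
                         size_t pixel_count, float alpha /* e.g. 0.1f */)
{
    for (size_t i = 0; i < pixel_count; ++i) {
        /* Each frame contributes only a few noisy rays; the running
         * average hides the noise at the cost of ghosting and lag. */
        history[i] = (1.0f - alpha) * history[i] + alpha * current[i];
    }
}
```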

Quake 3's volumetric fog used naive raymarching. By the PS3 era we'd figured out you can do it badly, per-pixel, and then blur. OpenGL 1.1 didn't do "blur." OpenGL 1.1 barely did "per-pixel." id Software did it the hard way, in tiny steps, on the CPU. And yet it still ran great, because they did it per vertex, and blended across wobbling triangles, and it looked fucking great.
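(Per vertex, "naive raymarching" really is this dumb. A rough sketch of the idea as described - not Quake 3's code; the box "fog brush" and all the names are made up.)

```c
/* Per-vertex raymarched fog: march from the eye to the vertex in fixed
 * steps, count the distance spent inside the fog, and hand back an alpha
 * the rasterizer interpolates across the triangle.  Illustrative only. */
#include <math.h>

typedef struct { float x, y, z; } vec3;

/* Hypothetical fog brush: an axis-aligned box, purely for illustration. */
static int point_in_fog(vec3 p)
{
    return p.x > -256.0f && p.x < 256.0f &&
           p.y > -256.0f && p.y < 256.0f &&
           p.z >    0.0f && p.z <  128.0f;
}

float fog_alpha_for_vertex(vec3 eye, vec3 vertex, float step, float density)
{
    vec3  d   = { vertex.x - eye.x, vertex.y - eye.y, vertex.z - eye.z };
    float len = sqrtf(d.x*d.x + d.y*d.y + d.z*d.z);
    float inside = 0.0f;

    if (len <= 0.0f)
        return 0.0f;

    for (float t = 0.0f; t < len; t += step) {
        vec3 p = { eye.x + d.x * (t / len),
                   eye.y + d.y * (t / len),
                   eye.z + d.z * (t / len) };
        if (point_in_fog(p))
            inside += step;                /* accumulate fogged distance */
    }
    return 1.0f - expf(-density * inside); /* simple exponential falloff */
}
```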

I'm tempted toward an "eat hot chip and lie" rant about modern developers who can't imagine doing anything only a thousand times per frame.
