this post was submitted on 06 Jun 2025

Game Development


Welcome to the game development community! This is a place to talk about and post anything related to the field of game development.

[–] nlgranger@lemmy.world 14 points 6 days ago* (last edited 6 days ago) (2 children)

As an engineer, Z-up right-handed is the only one that makes sense. All of engineering follows this convention; motion sensors (for example) or GPS topography use it.

Camera calibrations are also right-handed, but typically rotated so that X points right, Y points down, and Z points along the depth axis. That way, the projection formula directly gives the pixel column, row, and depth values.

camera coordinate system
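A minimal Python sketch of that projection convention (the intrinsics `fx`, `fy`, `cx`, `cy` below are made-up example values, not from the comment above):

```python
# Pinhole projection in the common camera convention:
# X right, Y down, Z forward (depth).
# Dividing by depth and applying the intrinsics gives (column, row).

def project(point, fx, fy, cx, cy):
    """Project a 3D camera-space point to (column, row, depth)."""
    x, y, z = point
    col = fx * x / z + cx  # X right -> pixel column
    row = fy * y / z + cy  # Y down  -> pixel row
    return col, row, z

# A point 2 m ahead of the camera, slightly right of and below the axis:
print(project((0.2, 0.1, 2.0), fx=500, fy=500, cx=320, cy=240))
# -> (370.0, 265.0, 2.0)
```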

[–] calcopiritus@lemmy.world 19 points 6 days ago (1 children)

IMO there's no correct answer. It depends on the purpose.

Intuitively, XY forms the plane where things are, and Z is the "additional dimension". We think of 2D first and then move to 3D, so that's the natural way to think about it.

So if you make a Cities: Skylines-type 2D game, the XY plane forms the ground. Which means that when you make the game 3D, Z means up.

But if you're making a platformer (like the old Marios), XY is the screen plane, like when drawing a graph on paper. If you then make your game 3D (as in, using 3D models for assets while the movement stays 2D), Z points forwards.

Sometimes even Y-down makes sense, as with UI programming. This is especially important for window resizing: when you resize the window, you want everything to look roughly the same, just with more space. That's easier if you put (0,0) at a point that doesn't move when resizing, like the top-left corner, since the bottom-left corner moves if you make the window taller.

I think the preference for Y as the vertical axis in gaming comes from a legacy of orienting the work around screen space (roughly 50 years ago). It was more efficient to have a memory layout that made it easy to supply data for generating a TV signal. Since a CRT raster scans from upper-left to lower-right, line by line, that became the standard for screen space: inverted Y. So screen memory (or however the screen is represented in RAM) starts at address/offset zero and increments from there. All the other features, like hardware sprites, used the same convention, since it was faster and simpler (easier math).
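The layout argument can be sketched in a few lines; the mode dimensions below are just an illustrative assumption:

```python
# Row-major framebuffer indexing with a top-left origin: the CRT
# raster order (left-to-right, top-to-bottom) maps directly onto
# increasing memory addresses, so no per-pixel flipping is needed.

WIDTH, HEIGHT = 320, 200  # e.g. a classic low-res VGA mode

def offset(x, y):
    """Offset of pixel (x, y) from the start of screen memory."""
    return y * WIDTH + x

print(offset(0, 0))    # top-left pixel        -> 0
print(offset(319, 0))  # end of first scanline -> 319
print(offset(0, 1))    # start of second line  -> 320
```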

When consumer 3D cards were relatively new, this was inherited, most notably when the hardware started to ship with "z-buffer" support. Carrying all this baggage into 3D engines just made sense, since there was never any confusion about the screen-orientation of it all. Why bother flipping axes around from world and camera space to the screen?

[–] soulsource@discuss.tchncs.de 7 points 6 days ago

In Physics we mostly used right-hand, but X-right, Y-up, and Z pointing towards the viewer.

But those are details. The only important choice is between left- and right-handed, as that affects the signs in the cross product (and some other formulas - generally everything that cares about which rotation direction is considered positive).
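A small Python illustration of the sign point: the cross-product formula itself never changes, but whether its result matches the basis's "third" axis depends on handedness.

```python
def cross(a, b):
    """Standard cross-product formula (handedness-agnostic algebra)."""
    ax, ay, az = a
    bx, by, bz = b
    return (ay * bz - az * by,
            az * bx - ax * bz,
            ax * by - ay * bx)

X, Y = (1, 0, 0), (0, 1, 0)

# Right-handed basis: X x Y equals the basis Z axis (0, 0, 1).
print(cross(X, Y))  # -> (0, 0, 1)

# In a left-handed basis the third basis vector is (0, 0, -1),
# so the formula's result points opposite to it - which is why
# rotation signs and cross-product formulas need flipping.
print(cross(Y, X))  # -> (0, 0, -1): swapping operands flips the sign
```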

[–] flatlined@lemmy.dbzer0.com 6 points 6 days ago* (last edited 6 days ago)

Hah, reading this not too long after watching Freya Holmér's latest video/talk is fun. "Everybody is disagreeing, except everyone is agreeing that Unreal is wrong." The image is by her (it's also used in the presentation). She also mentioned wanting to make it easier/possible to change handedness, and she'd already had an 'apology' from Sweeney for the Unreal configuration. Interesting to see they're taking it further than that.

[–] DragonTypeWyvern@midwest.social 3 points 6 days ago (1 children)

Every day we stray further from God's light.

[–] Kissaki@programming.dev 2 points 5 days ago

Let's call the axes g o and d.

[–] Gobbel2000@programming.dev 35 points 1 week ago (1 children)

Ah yes, the standard Minecraftian coordinate system.

[–] usernamewastaken@lemmy.world 9 points 6 days ago* (last edited 6 days ago)

tbh at this point most people taking a linalg class or getting into game programming are probably very familiar with the minecraft coordinate system

similarly to how minecraft has taught an entire generation basic english

[–] RightHandOfIkaros@lemmy.world 15 points 1 week ago (3 children)

I sure hope Blender does this soon. Very annoying workflow using it with Y-up software.

[–] Hadriscus@lemm.ee 5 points 6 days ago* (last edited 6 days ago)

There were discussions recently about the feasibility of changing Blender's coordinate system. The answer was that it is highly unlikely to happen, because there is no single place where it is defined. Instead, the assumption that Z is up is baked in throughout the code, probably across tens of thousands of lines, and the work to regularize that would be gargantuan with no clear benefit, since exporters and importers are supposed to handle these transformations anyway.

[–] ulterno@programming.dev 10 points 1 week ago (2 children)

In the case of Blender, I'd say it could be a user choice.
But I like Z-up right-handed because it matches the convention used in high-school physics.

Rather, why doesn't Y-up software change to Z-up instead?
Were they taught to use Y-up in Kinematics 3D? I doubt that.

[–] SorteKanin@feddit.dk 2 points 6 days ago (1 children)

Y up makes more sense for 2D I guess.

[–] ulterno@programming.dev 2 points 6 days ago* (last edited 6 days ago)

Yes.
And that's why I once assumed that a game engine with Y-up as the default probably started out as a 2D game engine.

I remember getting downvoted for that

[–] RightHandOfIkaros@lemmy.world 5 points 1 week ago (1 children)

Z-up is really only common among architectural or engineering type software. CAD and other types of software, for example.

Y-up is common among basically all other types of software. The DirectX API is Y-up, for example. That means anything Z-up that interacts with DirectX needs to convert the axes before doing its calculations (a near-zero-cost micro-optimization by itself, but it could add up depending on the software and platform).

It's probably fine to leave as-is in the long run, but it's just annoying to me that I need to convert between the axes in Blender's export settings. It would be more convenient if all the software I use shared the same system, which is now Y-up; Blender is literally the only exception. Honestly, I agree that it should be a user setting in Preferences, but I don't know if Blender is programmed to handle this or if it would need to be entirely rewritten.
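For what it's worth, the axis conversion itself is tiny. Here's a hedged sketch of one common Z-up right-handed to Y-up right-handed mapping (a rotation, so handedness is preserved; whether a given exporter uses exactly this mapping is an assumption):

```python
def z_up_to_y_up(v):
    """Rotate a Z-up right-handed point into a Y-up right-handed frame:
    up (+Z) becomes +Y, and the old +Y becomes -Z."""
    x, y, z = v
    return (x, z, -y)

print(z_up_to_y_up((1.0, 2.0, 3.0)))  # -> (1.0, 3.0, -2.0)
```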

[–] ulterno@programming.dev -2 points 6 days ago

Well, Kinematics 3D^[an academic subject, which should supposedly be your first introduction to a 3-dimensional coordinate system, after the pre-introduction back in Kinematics 2D and the pre-pre-introduction when learning the number line - i.e. if the academic curriculum was sensible] uses Z-up. The main reason for Y-up is that the 2D monitor was the primary target for graphics output back when these things were started, so X and Y were mapped first and Z had to be added when converting the paradigm to 3D. Now that we are slowly transitioning to having the same application usable with both AR/VR tech and monitors^[which means that in some cases the user might be looking at some axes gimbal, requiring a translation layer later anyway - in this case on the application developer's side, which adds cognitive load], we might as well all go with Z-up from the start.

[–] Cricket@lemmy.zip 2 points 1 week ago* (last edited 1 week ago)

~~I haven't used Blender in a long time, but the graph in the OP says it's Z-up?~~

Edit: never mind, I misunderstood what you wrote!

[–] popcar2@programming.dev 14 points 1 week ago

After all those years... I can't believe it...

[–] match@pawb.social 10 points 1 week ago

i enjoy that Minecraft is its own software suite

[–] egerlach@lemmy.ca 9 points 1 week ago (3 children)

Okay, am I understanding this correctly? In Y-up right-hand, positive X is "to the left"?

I personally think that right-hand Z-up makes the most sense, but it makes sense to move to where most of the industry is going.

[–] Reddfugee42@lemmy.world 5 points 1 week ago (2 children)

Z is depth. Why should it ever be up? Honest question.

[–] nlgranger@lemmy.world 2 points 6 days ago

For cameras, the convention is Z depth but right handed:

  • X rightward / pixel column from the left
  • Y downward / pixel row from the top
[–] egerlach@lemmy.ca 6 points 1 week ago (1 children)

It depends if you view the X-Y plane as the camera/viewport/screen or the ground. If X-Y is the ground, then Z is either up or down.

[–] bjoern_tantau@swg-empire.de 4 points 1 week ago (1 children)

From the chart I would guess that in Y-up right-hand positive X is to the right. You look at the palm of your hand.

That way when you develop a 2d platformer you would use a standard XY coordinate system. Switching to 3d would logically add the z axis as depth and not height. A movie is usually shot that way as well.

Of course that analogy breaks down as soon as your base game is top-down, like a city planner or such.

Anyways, with standards it's often best to just go with what most others do. So kudos to Unreal for not being stubborn.

[–] Philippe23@lemmy.ca 1 points 1 week ago

He references "Left-Up-Forward", so X is Left, Z is into the screen.

2D is not a consideration, even if it'd be logical. (That's what left-handed Right-Up-Forward grows out of.)

I don't think that's correct. Here's a drawing I did when trying to get my head around this.

drawing

I find trying to make sense of terms like "to the left" tricky, because we can rotate the directional cube any way we want. For example, in my drawing for "Y-up, left-handed", the red X axis points leftwards. However, we could rotate the unit-vector cube so that the X axis points right and the Y axis points up (i.e. the orientation we're most familiar with from 2D graphs). The Z axis would then point away from us, into the plane of the paper/screen.

In contrast, if we oriented the Y-up right-handed cube in the same way, then the Z axis would be oriented as if to come out of the plane of the screen/page, towards us.

These distinctions only matter once we add a third dimension, so left- or right-handedness is basically the question: "when we add the third axis to a 2D square made by the other two axes, does the third axis come towards us or away from us?" I apologise if this hasn't made things any clearer - I can make things make sense by imagining the rotations in my head, but not everyone is able to visualise them like that.
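The "towards us or away from us" question also has a numeric answer: the sign of the scalar triple product of the three basis vectors. A small sketch:

```python
def handedness(x, y, z):
    """Return +1 for a right-handed basis, -1 for a left-handed one,
    using the sign of the scalar triple product dot(cross(x, y), z)."""
    cx = (x[1] * y[2] - x[2] * y[1],
          x[2] * y[0] - x[0] * y[2],
          x[0] * y[1] - x[1] * y[0])
    t = cx[0] * z[0] + cx[1] * z[1] + cx[2] * z[2]
    return 1 if t > 0 else -1

# Y up, Z out of the screen (towards us): right-handed
print(handedness((1, 0, 0), (0, 1, 0), (0, 0, 1)))   # -> 1
# Y up, Z into the screen (away from us): left-handed
print(handedness((1, 0, 0), (0, 1, 0), (0, 0, -1)))  # -> -1
```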

[–] brisk@aussie.zone 2 points 1 week ago (1 children)

Was this posted somewhere other than twitter?

[–] Justdaveisfine@midwest.social 3 points 1 week ago

I have not seen base Unreal Engine mentioned outside of Twitter, but the change for UEFN has been noted elsewhere:

https://forums.unrealengine.com/t/we-re-moving-to-the-left-up-forward-luf-coordinate-system/2540901
