For starting out in game programming, I recommend the Godot game engine. It uses its own language, GDScript, which is similar enough to Python to be useful to learn. It comes with a large library of game functionality, a healthy ecosystem, and tools for working with art and assets. The documentation is great, and there are getting-started tutorials.
Ultimately I wouldn’t worry too much about the choice of language, framework, or SDK. Whatever you pick might not apply to the jobs you end up finding. The best language and engine are the ones you will follow through on and get over the initial learning curve with. After that you can adapt to anything.
Of course, another option is to just stay independent, make and sell your own games, and not worry about anyone hiring you. Running your own business is tough, and it’s definitely not for everyone, but it’s probably easier now than it’s ever been, thanks to Steam for distribution and the number of free engines and toolkits available. Some of my favorite games from the past decade were created by what were basically one-man teams (sometimes with a little help from one or two other people for things like background music). For a lot of these games, the assets are ludicrously simple, like something a child would draw in MS Paint, but as long as the gameplay is good enough, that’s OK.
My contribution here would be to say, if you’re a newbie, don’t learn a language with a specific career role in mind. Learn a language to learn how to learn a language. Learn a language to achieve some basic task that you’d like to automate. Or learn a language with some specific learning goal in mind. Even if it doesn’t turn out to be your productive career language, knowledge of the differences will make you a better programmer all around.
With that said, if you have no goal in mind other than to get started, I would suggest starting out with C. Some have said it’s like a tour through the history of programming. You’ll find that many languages are actually implemented in C, or talk to the outside world through C bindings under the hood, and you’ll learn what those languages do for you that C doesn’t, and where those conveniences can become limitations or bloat. The goal wouldn’t be to become an expert C programmer, just to gain basic competence. It would be analogous to learning the piano before learning other musical instruments. Many people don’t, but for those who do, it’s an incredibly helpful foundation.
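To make that concrete, here’s a minimal sketch in plain C of the kind of bookkeeping the language makes you do yourself. The growable array is just my own illustration of roughly what something like Python’s list.append handles for you behind the scenes, not code from any particular library:

    /* Illustrative only: appending to a growable array, by hand. */
    #include <stdio.h>
    #include <stdlib.h>

    int main(void) {
        size_t capacity = 4, count = 0;
        int *nums = malloc(capacity * sizeof *nums);
        if (!nums) return 1;

        for (int x = 0; x < 10; x++) {
            if (count == capacity) {            /* out of room: grow the buffer */
                capacity *= 2;
                int *bigger = realloc(nums, capacity * sizeof *nums);
                if (!bigger) { free(nums); return 1; }
                nums = bigger;
            }
            nums[count++] = x;                  /* the "append" itself */
        }

        for (size_t i = 0; i < count; i++)
            printf("%d ", nums[i]);
        printf("\n");

        free(nums);                             /* no garbage collector here */
        return 0;
    }

Once you’ve written that a few times, you understand both why higher-level languages hide it and what it costs them to do so.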
So pick up a few. In addition to C, try Python, as it’s currently seen as an all-around “golden hammer” you can use for many kinds of problems; I’m using it for some ML work right now. Likewise, JavaScript is ubiquitous, and you can do some genuinely productive browser-side work in it, like writing custom browser extensions. And just for fun, maybe take a spin through Lua, a scripting language embedded in many games for the tasks that don’t need the power of C++.
^ The above, especially the first paragraph. I spent 40+ years avoiding honest work in IT, and one of the most essential qualities I’ve discovered in good developers is the ability to be a “computer whisperer”: that is, to be able to look at a problem or process not as a human would, but as a computer would*. If one has that, then the specific language or syntax, while not irrelevant, is secondary.
At one point I taught Introduction to Computers at a local technical college, and when it came to the section on programming I started out by asking the class, “What’s two plus two?” Except for the occasional wiseacre, the answer was inevitably “four.” “How did you get that answer? Did you have to count on your fingers?” “No, I just know it.” “Well, guess what? The computer doesn’t know it. You learned through repetition to the point that the answer is pretty much instinctive; but the computer doesn’t have that instinct, so every time you ask, it has to run the calculation again.” Granted, that’s an oversimplification, and it’s becoming a tad obsolete in the age of AI and heuristics; the point, however, is that humans use a lot of mental shortcuts that aren’t available to the computer, and programming requires breaking the problem down into discrete steps the computer can process to get the desired result.
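To put that in code terms (my own toy example, not anything from the class): here is the “counting on fingers” version of addition, spelled out as discrete steps a machine could follow. Real hardware has an add instruction, of course; the point is only that the answer comes from running steps every single time, not from “just knowing.”

    /* Illustrative only: adding by counting up one at a time. */
    #include <stdio.h>

    int add_by_counting(int a, int b) {
        int result = a;
        for (int i = 0; i < b; i++) {   /* repeat b times: count up by one */
            result = result + 1;
        }
        return result;
    }

    int main(void) {
        printf("%d\n", add_by_counting(2, 2));  /* prints 4, recomputed on every call */
        return 0;
    }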
EXIT HIJACK AND RETURN TO MAIN DISCUSSION
* Yes, I’m anthropomorphizing the computer. But it’s simply for illustrative purposes.
Yes, I used to compare programming to playing Lemmings: doing something useful by stacking “dumb” commands. And since then there have been games built on pretty much exactly this premise, like 7 Billion Humans.
But yeah, ISTM that’s quite different from coding these days. On complex projects like modern games, most people necessarily have to work at quite a high level of abstraction a lot of the time. Not saying this is bad or good, just a difference.