GPT version of Super Mario is here – brings back childhood memories


What is it like to generate game levels from text and then play them yourself? Let’s take a look at a trending GitHub project, the “GPT version of Mario”. All you have to do is type “more pipes, many enemies, some blocks, low elevation” into the text box:


Click “Generate level” to get your own Mario game:


The left side is the play area; you can play directly, using the arrow keys plus the a, s, and d keys for control. The right side is an overall rendering of the level generated according to your requirements.

Feel free to adjust these options to unlock more styles.

For example, with fewer obstacles:


Or with fewer pipes and more obstacles:


This wave is full of childhood memories: even if the game itself may no longer hold your interest, the nostalgia keeps you going. Note that all of the results you see above are generated with GPT-2. The large language model has made a contribution once again.

The model can comprehend commands in plain English. Beyond the commands above, it understands prompts like “many pipes and many enemies” and “no pipes, no enemies, many blocks”, thanks to the labelled training data.

It also generalizes to prompts that do not appear in the dataset, such as “many pipes, no enemies, many blocks”. A failure case, however, is “many pipes, no enemies, some blocks”.

MarioGPT


Generate Mario levels with GPT-2

The model behind the project is called MarioGPT. It is the first model to generate game levels from text (text2level), fine-tuned from GPT-2 (specifically distilgpt2). The authors are from the IT University of Copenhagen. Its training data consists of levels from Super Mario Bros. and Super Mario Bros.: The Lost Levels, provided by The Video Game Level Corpus.

The specific principle is shown in the figure below:


Like GPT-2, MarioGPT predicts the next token in a sequence. A level is represented as a string, which is tokenized with a Byte-Pair Encoding (BPE) tokenizer. Levels are broken down by column and flattened into a single vector (or multiple levels into a batch of vectors). To incorporate the text entered by the user, the authors attach a frozen text encoder to MarioGPT, in the form of a pre-trained bidirectional language model (BART).
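As a rough illustration of the column-wise flattening step (this is a sketch, not the authors’ actual code, and the tile characters are made up rather than the corpus’s real alphabet):

```python
# Sketch: flattening a tile-based level into one string, column by column,
# before BPE tokenization. Tile characters here are illustrative only.

level_rows = [
    "----",   # sky
    "--E-",   # an enemy
    "XXXX",   # ground
]

def flatten_by_column(rows):
    """Read the grid top-to-bottom within each column, left to right."""
    height = len(rows)
    width = len(rows[0])
    return "".join(rows[r][c] for c in range(width) for r in range(height))

flat = flatten_by_column(level_rows)
print(flat)  # "--X--X-EX--X"
```

Serializing by column (rather than by row) means consecutive tokens describe one vertical slice of the level at a time, which matches how the model generates a level left to right.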

The prompt is passed through this encoder in a forward pass, and the average of its hidden states is taken as the prompt representation. That output is then fed into the cross-attention layers of the GPT-2 architecture, where it is combined with the actual level sequence passed into the model. The authors were surprised by how well MarioGPT works: 88% of the levels it generates are actually playable from start to finish.
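To make the cross-attention step concrete, here is a toy, pure-Python sketch of the mechanism: decoder-side (level) queries attend over encoder-side (prompt) states and mix their values. The vectors and sizes are invented for illustration; the real model uses learned projections and many more dimensions.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def cross_attention(queries, keys, values):
    """For each query, mix the value vectors weighted by softmax(q . k)."""
    out = []
    for q in queries:
        weights = softmax([dot(q, k) for k in keys])
        mixed = [sum(w * v[i] for w, v in zip(weights, values))
                 for i in range(len(values[0]))]
        out.append(mixed)
    return out

# Two level-token queries attend over two prompt states.
queries = [[1.0, 0.0], [0.0, 1.0]]
keys    = [[1.0, 0.0], [0.0, 1.0]]
values  = [[5.0, 5.0], [-5.0, -5.0]]
print(cross_attention(queries, keys, values))
```

Each query ends up pulled toward the value whose key it aligns with, which is how the prompt representation steers level generation.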

How to play?

Since MarioGPT is open source, you can download it and experience it yourself. After making sure Python 3.8+ is installed on your computer, install it with pip or from source:

`pip install mario-gpt`, or clone the repository with git and install from source (e.g. with `python setup.py install`).

Generating a level requires, at minimum, only a few lines of code.
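The snippet below is a sketch based on the project’s README; exact argument names and defaults may differ between versions, so treat it as illustrative rather than authoritative:

```python
from mario_gpt import MarioLM

# Load the pretrained distilgpt2-based model (downloaded on first use).
mario_lm = MarioLM()

# Prompts use the same vocabulary of quantities and elements as the demo.
prompts = ["many pipes, many enemies, some blocks, high elevation"]

# Sample a level; each generated column of the level is several tiles tall,
# so num_steps controls the overall level length.
generated_level = mario_lm.sample(
    prompts=prompts,
    num_steps=1400,
    temperature=2.0,
)

# Interact with the result (the simulator requires Java 8+):
generated_level.play()       # play the level yourself
generated_level.run_astar()  # watch an A* agent attempt it
```

The `play` and `run_astar` calls correspond to the interactive and agent-driven modes mentioned in the tutorial below.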


The author also provides a more in-depth tutorial in the project repository. To try out generated levels yourself, follow the steps below.

(1) Play the demo on Hugging Face. It can generate arbitrary levels without you typing any text at all, just by selecting options such as “more” or “less” for each element.

(2) Control it from code: use the play and astar functions, provided your computer has Java 8+ installed.

Those who are interested can try it out. At the moment, MarioGPT has more than 500 stars.

Rather than one of today’s cutting-edge conversational AIs, MarioGPT is built on GPT-2. Large language models are general-purpose pattern detection and replication engines, so they are good at far more than ingesting sentences like these and producing more of them.

“We honestly just picked the smaller one to see if it worked!” said Shyam Sudhakaran, lead author on the paper, in an email to TechCrunch. “I think with small datasets in general, GPT2 is better suited than GPT3, while also being much more lightweight and easier to train. However, in the future, with bigger datasets and more complicated prompts, we may need to use a more sophisticated model like GPT3.”

Even a very large LLM won’t be able to grasp Mario levels directly, so the researchers had to first convert a selection of them into text. This resulted in a Mario variation akin to Dwarf Fortress.

Super Mario

Super Mario is a classic 2D side-scrolling platform game developed by Nintendo. The player controls Mario, a plumber who must navigate through various levels, defeating enemies and avoiding obstacles to rescue Princess Toadstool from the evil Bowser. This game brings back childhood memories for many people.

Mario can run, jump, and stomp on enemies to defeat them. He can also collect power-ups such as mushrooms, which make him grow in size, and fire flowers, which allow him to shoot fireballs. Mario must avoid falling into pits, getting hit by enemies, and running out of time, as each level has a time limit. The game features a variety of enemies, such as Goombas, Koopa Troopas, and Piranha Plants, as well as bosses at the end of each world. There are also hidden secrets and bonus areas for players to discover. The game has become a cultural icon and has spawned numerous sequels and spin-offs, as well as influencing the platformer genre as a whole.
