Introduction
Some time ago I was playing with the idea of adding “programming” to Space Engineers. Today, I would like to share with you my design document - still in its early version.
Please keep in mind that everything that is mentioned below is in a very preliminary stage and is subject to change.
This is a first “design draft” that needs to be specified in higher detail before we start the implementation (e.g. we need more specific GUI and HUD drawings, we need to design how the copy-pasting of programs will work, etc.).
This feature is not about modding. Programming and HUD customization will be a natural part of the game environment.
Programs could be edited, copied and pasted inside the game environment and also between players in multi-player – for example, one player writes a simple HUD program and copy-pastes it into the chat so that others can integrate it into their HUD. A similar idea will apply to all types of programs, but we should come up with a more user-friendly version, accessible even to players who don’t have programming experience.
Lastly, this will be a great way for people who want to learn programming in an entertaining and easy way!
Programming
Players will be able to write small programs that will be executed in computer blocks or computer components. This feature will be an integral part of the game’s environment; it’s not modding.
Computer Blocks versus Computer Components: I am still not sure if we should make a new computer block which will be the only entity where programs can run, or allow programs to run in any computer component (if we allow the latter, then players would write programs for doors, reactors, etc. and it could get messy).
Programming language: I would prefer some C# interpreter - instead of compiled code (DLL), mainly for security reasons. The interpreter should be object-oriented and support all standard C# features, perhaps .NET, LINQ, garbage collector, etc.
Editor: The program code can be edited from the Control Panel (similar to how we configure block attributes). Don’t forget that some blocks can have multiple computer components, and the player should be able to distinguish between them and program them independently.
Programs can be set ON / OFF.
Damage: when a computer component gets damaged (below operational integrity), its program is completely wiped out and lost, and will not be executed anymore.
Execution: these small programs can be compared to pixel/vertex shaders. They have a “main” function which is executed in every update (60x per second), on server or by object-owner. They can have variables/members that keep their value from update to update – so they can hold an internal state. These variables can be accessed from other computer components (they are public by default). Each computer component is an instanced object, not a class.
Access: programs can access attributes and methods of other blocks and components, but only within one grid (station or ship). Wireless connection is not available yet. Computers within a grid can be searched by a block name (e.g. GetByName("rotor_on_the_left")).
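To illustrate the access rules above, here is a small sketch of how one computer component could read the public state of another on the same grid. GetByName comes from this draft; the component name "door_timer" and the member delay_seconds are hypothetical, not a final API:

```csharp
// Runs inside one computer component; "door_timer" is a hypothetical name
// given to another computer component somewhere on the same grid.
public float cached_delay = 0;

void main()
{
    // GetByName searches only within this grid (station or ship).
    var timer = GetByName("door_timer");

    // Members of other programs are public by default, so their internal
    // state can be read directly.
    cached_delay = timer.delay_seconds;
}
```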
Code editor:
- Syntax highlighting
- Syntax check
- Suggestion wizard (e.g. after writing a dot it will offer all available members)
- Text copy-paste-delete
- Other features which are standard in code editors
- Programs can be copied to/from blueprints (program is a special type of blueprint)
Exception/error handling: (examples: NaN, divide by zero, etc.). In case of an error, the interpreter will stop execution of the program in the current update, but it will not roll back to the pre-update state. It will run the program again in the following update, but it may end up with an error again. Error info is written to the “program debug window” (see scheme above), but there are no runtime crashes or assert windows. (I remember that Visual Basic used to work this way.)
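This error-handling behavior could be sketched from the interpreter’s side roughly like this (hypothetical host code – the type and member names are illustrative, not a committed design):

```csharp
// Sketch of the interpreter's per-update execution: an error aborts only
// the current update; there is no rollback, and the program is simply
// tried again on the next update.
void RunProgramUpdate(ComputerProgram program)
{
    try
    {
        program.Main();  // executes the player's "main" function
    }
    catch (Exception e)
    {
        // No crash, no assert window: just log to the program debug window.
        program.DebugWindow.WriteLine(e.Message);
        // Any state the program changed before the error is kept;
        // the next update will call Main() again (and may fail again).
    }
}
```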
CPU time and memory: we need to avoid situations where players create infinite loops, programs with extensive operations, huge memory allocations, etc. Is this possible? We don’t want network clients to run extremely complex programs on servers.
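One possible safeguard (an assumption on my part, not a committed design) is for the interpreter to count interpreted instructions per update and abort an update that exceeds its budget, so an infinite loop cannot stall the server:

```csharp
// Hypothetical budget enforcement inside the interpreter: every
// interpreted instruction ticks a counter, and exceeding the per-update
// budget raises an error that is handled like any other program error.
const int MaxInstructionsPerUpdate = 10000;  // illustrative limit
int instructionsExecuted = 0;

void OnInstructionExecuted()
{
    instructionsExecuted++;
    if (instructionsExecuted > MaxInstructionsPerUpdate)
        throw new ScriptBudgetExceededException();  // stops this update only
}
```

A similar cap could apply to memory allocations, so that huge allocations from a client program cannot exhaust server memory.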
Replication: There can be a function for copying/replicating code to other computer components or blocks from within a program (code injection). Players could write viruses :-)
Classes & inheritance: my current opinion is that these small programs are classes, but not of the same type. I will explain it using this example: imagine having two doors in the game (door1 and door2). You write one program for door1 and another program for door2. Even though you are still working with doors, the two programs you just wrote are two different classes – they may have different attributes, state, methods, etc.
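The two-door example might look like this in practice – each door carries its own independently written program, so each is effectively its own class (all names here are illustrative, not a final API):

```csharp
// Program stored in door1's computer component - one class, with its own
// state and behavior:
public float open_delay = 2.0f;  // seconds to wait before auto-closing

void main()
{
    // ... logic using open_delay ...
}

// Program stored in door2's computer component - a *different* class,
// even though it also controls a door:
public int times_opened = 0;

void open_and_count()
{
    times_opened++;  // state unique to this program
}

void main()
{
    // ... logic using times_opened ...
}
```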
Example code:
- public float some_state_variable = 10;
- private float some_other_private_variable = 20;
- void main() // main function gets called on every update
- {
- var rotor1 = GetByName("rotor1");
- rotor1.Rotate(30);
- var rotor2 = GetByName("rotor2");
- rotor2.Stop();
- var light1 = GetByName("light1");
- light1.SetColor(255, 255, 0);
- }
Sample list of objects (computer components, attributes and methods):
- base class for all blocks
- IsPowered() // is electricity powering it now?
- IsEnabled()
- some access to block’s Inventory if that makes sense
- Door
- Open()
- Close()
- Light
- Color (we probably have diffuse, specular, etc)
- Radius, Falloff and other range parameters
- Type
- Thruster
- AddForce() // or something like that
Examples in other games:
- CodeSpells
- https://sites.google.com/a/eng.ucsd.edu/codespells/
- http://www.kurzweilai.net/a-video-game-that-teaches-how-to-program-in-java
- Colobot
- Botlogic
- Programming games
Possible visual/text programming language: http://vvvv.org/
Computercraft http://www.computercraft.info/
Proximity sensor
Proximity sensor is a new type of block that can be placed on top of other blocks (similar to how we place interior lights) and that scans a cone-shaped area in front of it. Large and small block sensors will look the same and will have the same proportions.
Scanning: the sensor keeps scanning the cone-shaped area in front of it on every update (60x per second), and whenever it detects an object (moving or stationary – this depends on the velocity threshold), an event function on the sensor is triggered –> on_detect() –> this function is executed inside the movement detector computer component.
Detected objects: the sensor detects all types of objects – blocks from its own grid, asteroids, ore, small items, even animated doors, players, rotors or active thrusters... perhaps even particle effects.
Inspiration: http://en.wikipedia.org/wiki/Proximity_sensor
Parameters:
- Range - from 0 to 300 meters (Note: consider performance implications and decrease if necessary)
- Angle - max 180 degrees
- Laser projection of scanned area (player can observe the scanned area; transparent red polygons, similar to what we had in Miner Wars’s scanners)
- Velocity threshold (if this is easy to implement) – if set to zero, even static objects are detected; if set above zero, only objects with a higher velocity are detected.
- Visual signaling – true/false – when true and an object is detected, the sensor will blink
- Energy consumption – a little higher than spot lights; depends on angle and range
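Putting the sensor event and the velocity threshold together, a program in the sensor’s computer component might look like this sketch (on_detect() is from this draft; the DetectedObject parameter and member names are assumptions, not a final API):

```csharp
// Sketch of a program running in the proximity sensor's computer component.
public float velocity_threshold = 0.5f;  // m/s; zero would detect static objects too

// Hypothetical event handler, called whenever something enters the scanned cone.
void on_detect(DetectedObject obj)
{
    if (obj.Velocity < velocity_threshold)
        return;  // ignore slow or stationary objects

    // React to the detection, e.g. close a door on the same grid:
    var door1 = GetByName("door1");
    door1.Close();
}
```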
3D model: inspired by a semi-spherical ceiling light.
HUD programs
This idea is very similar to “block programming” – however, instead of writing simple programs that run in blocks, HUD programming is about writing simple programs that run in the HUD, and their output can be displayed on the player’s HUD screen.
HUD programs can access (read and write) only the information that’s accessible to the player:
- astronaut info (health, energy, velocity, etc.) – available at every moment
- ship and station info – only when sitting in a cockpit or connected to terminal/control panel (although at this moment the player won’t see the HUD because the GUI screen will be on top of it)
- remote – once we add wireless communications, players would be able to “communicate” with ships and stations even when not sitting in a cockpit
Current HUD: I am talking about displaying character and ship parameters such as speed, energy, etc. – we need to redo them as a customizable program that players can modify. This should be easy, and it will also show players how to use this feature.
GUI versus HUD: In my opinion the difference between the GUI and the HUD is that the GUI is also for things that are not part of the game world (e.g. Save, Exit to Main Menu, Options, etc.). The HUD should be only for things that are an integral part of the game world.
Storage: HUD programs are per-world and per-player (so they are stored in the world file and are specific to player – even in multi-player). Example: if player wants to use his cool new HUD in some other world that runs on a dedicated server and where this player is not even an admin, he can copy-paste it there through the game’s GUI/HUD and use it.
CPU and Memory: HUD programs run in every update (60x per second) and there are no limits on how complex HUD programs can be – if the player makes an error, only his computer will be affected by low performance.
Errors and debugging: errors are ignored (no crashes) and error messages are written to the HUD screen so that the player can instantly see if there’s a problem.
Example code:
- public float some_state_variable = 10;
- private MyGuiSlider slider1;
- private MyGuiLabel label1;
- // called only at start of HUD or whenever HUD program gets changed
- void init()
- {
- slider1 = new MyGuiSlider(...position...default color...etc);
- label1 = new MyGuiLabel(...position...default color...etc);
- }
- // main function gets called on every update
- void main()
- {
- GetByName("player character").Color = slider1.Value * 100;
- label1.Text = MyUtils.GetFormattedFloat(slider1.Value, 2);
- }
Editing: HUD programs are accessible from the terminal screen (control panel) under a new HUD tab. The player can see a list of all active HUD programs there and modify their position, shortcuts, etc.
Focus: HUD screens don’t have keyboard/mouse focus unless the player presses the shortcut key, after which everything under the HUD screen gets darker and the player can access the GUI controls on the HUD screen (e.g. move a slider). The player will leave this mode by pressing the shortcut again (or ESC).
Controls: all standard GUI controls should be available for HUD programming: textboxes, sliders, labels, buttons, comboboxes, etc.
---
Please keep posting your feedback and suggestions to the comments section below. I can’t reply to every comment, but I can assure you that I try to read as much as possible and your comments will influence how Space Engineers develops.
EDIT: You can also participate in the forum discussion about this topic: http://forums.keenswh.com/post/programming-in-space-engineer-discussion-6930159?pid=1283054218#post1283054218
Older forum discussions regarding this topic:
http://forums.keenswh.com/post?id=6786260
http://forums.keenswh.com/post?id=6925004
Thanks!
Marek Rosa
---