# SLM-Env
Build Unity environment binaries for SLM-Lab and release on npm for easy distribution.
To use a prebuilt environment, just add its npm package, e.g. `yarn add slm-env-3dball`.
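Once the package is added, its binary can be loaded with the `gym_unity` wrapper that ships with ml-agents (the same wrapper used to test binaries below). A minimal sketch, assuming the package ships its binary under `build/` as described in the Release section; the binary name inside the package is hypothetical, so adjust it to the package you installed:

```python
from gym_unity.envs import UnityEnv

# Hypothetical path: packages built from this repo ship their binaries under
# build/, so after `yarn add slm-env-3dball` the installed copy sits inside
# node_modules. Adjust the binary name to match the package you installed.
env = UnityEnv('node_modules/slm-env-3dball/build/3dball', 0)

state = env.reset()
for _ in range(10):
    state, reward, done, info = env.step(env.action_space.sample())
env.close()
```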
## Installation
Building a binary requires 4 things:
- Node.js with `npm`
- the Unity editor, installed via Unity Hub. Go to `Unity Hub > Installs > Editor > Add Modules > Linux Build Support` to enable Linux builds.
- the ml-agents repo with the environment's Unity assets: `git clone https://github.com/Unity-Technologies/ml-agents.git`
- this repo: `git clone https://github.com/kengz/SLM-Env.git`
## Build a Unity Environment binary
The goal is to build MacOSX and Ubuntu binaries that can be used through ml-agents' gym API. Currently this also restricts us to non-vector environments (i.e., a single agent instance per scene).
In this example, we will use the Walker environment. We also recommend first going through the Unity Hub tutorial to get basic familiarity with the editor.
- Open the `ml-agents/UnitySDK` folder in the Unity editor.
- In the Assets tab, find Walker under `ML-Agents > Examples > Walker > Scenes > Walker`. Hit the play button to preview it.
- Make any necessary asset changes:
  - to enable programmatic control, go to `WalkerAcademy` and check `Control` in the Inspector tab.
  - since we're not supporting vector environments, remove the extra walker clones by selecting all but the first `WalkerPair` game object and unchecking them in the Inspector tab.
  - next, open the asset `Walker > Brains > WalkerLearning` and in the Inspector tab, change `Vector Observation > Stacked Vectors` to 1. Also, click on `Model` and delete it so we don't include the pretrained TF weights.
- Go to `Edit > Project Settings > Player > Resolution and Presentation`. Ensure `Run in Background` is checked and `Display Resolution Dialog` is set to Disabled.
- Now we're ready to build the binaries. Go to `File > Build Settings`:
  - click `Add Open Scenes` and add your scene
  - click `Player Settings` to show the Inspector tab. Check `Run in Background`, set `Display Resolution Dialog` to 'Disabled'. Optionally, set `Fullscreen Mode` to 'Windowed'.
  - build one for MacOSX: hit `Build and Run` to render immediately after building. Choose the directory `SLM-Env/bin/` and use the name `unitywalker-v0`.
  - build one for Linux: hit `Build`, and use the same directory and name.
- Test the binary. First ensure you have the `mlagents_envs` (version `0.9.2`) and `gym_unity` pip packages installed from ml-agents. Use the following script to run an example control loop:

```python
from gym_unity.envs import UnityEnv

# point this at the binary built above (no file extension needed)
env = UnityEnv('/Users/YOURNAME/SLM-Env/bin/unitywalker-v0', 0)
state = env.reset()
for i in range(500):
    action = env.action_space.sample()  # sample a random action
    state, reward, done, info = env.step(action)
```
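If you test several binaries (or several copies of one) on the same machine, give each `UnityEnv` its own worker id so the Unity processes don't clash on the communication port, and close the environment when you're done so the Unity process exits. A minimal sketch, assuming the same `gym_unity` wrapper as above (the second constructor argument is the worker id):

```python
from gym_unity.envs import UnityEnv

# worker id 1 here; the script above uses 0. Distinct ids offset the
# communication port so two binaries can run side by side.
env = UnityEnv('/Users/YOURNAME/SLM-Env/bin/unitywalker-v0', 1)
state = env.reset()
state, reward, done, info = env.step(env.action_space.sample())
env.close()  # shuts down the Unity process cleanly
```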
The binary is now ready. Next, release it to npm.
## Release
Note: use the kebab-case naming convention with the prefix `slm-env` and the OpenAI gym version convention, so `slm-env-unitywalker-v0`.
- Open up `package.json` and update:
  - replace the env name as appropriate: `"name": "slm-env-unitywalker-v0",`
  - update the version
- Copy both the MacOSX and Linux binary files from `bin/` to `build/`.
- Release to `npm` (make sure you are logged in first, via `npm login`): `npm publish`
- Since the binaries are huge, `npm` will throw an error near the end of the upload. Just ignore it:

```
npm ERR! registry error parsing json
npm ERR! publish Failed PUT 403
npm ERR! code E403
npm ERR! You cannot publish over the previously published version 1.0.0. : slm-env-unitywalker-v0
```

- The package should now be available on npmjs.com; just search for `slm-env-unitywalker-v0`.
- Add the release to SLM-Lab for usage: `yarn add slm-env-3dball`