Minimal devcontainer with Codex for messing around with AI stuff

This is my minimal devcontainer for playing around with AI stuff, using Codex. It includes Python, uv, Node, gh and the Codex CLI. Running in a devcontainer gives me a clean environment to mess with and further sandboxes the coding agent.

To use it, create a .devcontainer directory in your project and add the following devcontainer.json file:

{
	"name": "Minimal Codex And Dev Container",
	"image": "mcr.microsoft.com/devcontainers/base:trixie",
	"features": {
		"ghcr.io/devcontainers/features/github-cli:1": {},
		"ghcr.io/devcontainers/features/node:1": {},
		"ghcr.io/devcontainers/features/python:1": {},
		"ghcr.io/devcontainers-extra/features/uv:1": {}		
	},
	// -i to force interactive shell which nvm requires
	"postCreateCommand": "bash -i .devcontainer/complete_setup.sh"
}

Then add the following complete_setup.sh script to the same directory:

#!/bin/bash

# install the latest LTS version of node
nvm install --lts && nvm use --lts

# install codex
npm install -g @openai/codex

Then launch your devcontainer. I use VS Code to do this.
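Once the container is up, a quick sanity check confirms the features and the setup script did their job (a hypothetical check, not part of the original setup — the tool names match what the container installs):

```shell
# report which of the expected tools are on the PATH
for tool in python uv node gh codex; do
  command -v "$tool" >/dev/null && echo "$tool OK" || echo "$tool missing"
done
```

If anything reports missing, re-run the postCreateCommand or rebuild the container.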

Finally run codex with:

codex -c sandbox_workspace_write.network_access=true --search

This gives Codex network access from inside its sandbox and the ability to perform web searches.
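Since the -c flag just sets a config value, the same setting can be persisted so it doesn't need to be passed on every run (a sketch, assuming the default ~/.codex/config.toml location; the TOML table mirrors the dotted path used with -c):

```toml
# ~/.codex/config.toml (assumed default location)
[sandbox_workspace_write]
network_access = true
```

With that in place, codex --search is enough.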

Google IO 2016 Highlights

Last month Google held its annual Google IO conference. Not really a surprise, but it was chock full of conversational and AI-based tech, which is all the buzz of late.

Three bits in particular stood out for me.

The first is the demo of Allo, their new chat app. The app itself wasn’t particularly groundbreaking, but the embedding of their new digital assistant tech (the Google assistant) is very interesting, and it really shows where Google thinks things are going. The fact that third parties will be able to hook into the Google assistant ecosystem, as shown with the OpenTable integration at the end of the clip, should have organizations interested in conversational commerce buzzing.

Here is the clip:

The second demo was for Google Home, their answer to the Amazon Echo. I suspect there was a bit of hand waving going on in this video clip but having done some work with the Echo, everything they demo should be doable. The one thing that did not ring true for me was how it was identifying specific family members so that it could perform actions in the context of their data and “profile”:

The last thing that stood out for me was the new Android Instant Apps tech. It basically allows a sliver of your app to be downloaded and run, which means that your users don’t have to install your app first. They demoed apps being launched from web searches and from an NFC tag scan. It really starts to blur the line between an app and a web page. Here is the clip from the keynote:

All in all, some pretty interesting tech will be hitting us shortly.