Archive for October, 2007

Ubuntu: Changing the default sound output device

Tuesday, October 23rd, 2007

Here’s a quick example of how you can change your default output device from /dev/dsp to /dev/dspX.

All you need to do is drop the following lines into ~/.asoundrc; you may need to restart your window manager.

pcm.!default {
    type hw
    card X
}
ctl.!default {
    type hw
    card X
}

Replace X with the device number (i.e. use 1 if you want to use /dev/dsp1 as your default output device).
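
If you’re not sure which card number to use, running aplay -l (from the alsa-utils package) will list the sound cards ALSA can see, along with their card numbers.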

Copter

Tuesday, October 23rd, 2007

How long can you keep the copter going?

Time Lapse Video

Sunday, October 21st, 2007

Ever since I moved into my new flat, a server has been taking a picture out of my back window every 60 seconds.

FlatView was released shortly after I moved in, offering pseudo-realtime access to the data.

Every picture taken has been saved, and converted into a video!

/files/time-lapse-60days.avi (208MB)

This covers the last 60 days of data, which is roughly how long I have been living in the flat.

I have found the sunrise, sunset and night time to be the most interesting times.

1: Graphical Output and Input

Monday, October 8th, 2007

Lecture Notes

This first lecture on Graphics by Duncan Gillies covered:

  • Device Independent Graphics
  • Raster Graphics
  • Normalised Device Coordinates
  • World Coordinates
  • The Normalisation Transformation
  • Output Primitives
  • Input Primitives

Device Dependence

Wherever possible, applications should be developed to be device independent. To provide device independence, graphics APIs (Application Programmer’s Interfaces) have been developed; examples include OpenGL and DirectX. These APIs (well, OpenGL specifically, since DirectX is Microsoft-only) allow a developer to create a graphics application that works across the majority of modern operating systems.

Raster Graphics

Most graphics devices are of the raster type, and allow the developer to plot points. The developer must specify the X and Y coordinates and the colour of the point they wish to plot. For example:

setPixel(XCoord, YCoord, Colour);

This leads to two problems:

  • What happens on different screen resolutions or application window sizes?
  • What if a different operating system addresses pixels differently?

World Coordinate System

World Coordinates are “Real World” coordinates that are mapped onto the pixel coordinates of a window. For example, an architect may choose to use metres as the unit of measurement instead of pixels. The World Coordinates can be specified for a particular window using code similar to:

setWindowWorldCoordinates(WXMin, WXMax, WYMin, WYMax);

This is merely chosen for the developer’s convenience, and World Coordinates can be mapped onto pixel coordinates at a later stage in the application before the pixel is modified. This allows the application to be size and resolution independent.

Graphics Primitives

Now that an internal system of points has been defined (the World Coordinates), the developer can now draw pictures using commands like:

drawLine(x1, y1, x2, y2);
drawCircle(centreX, centreY, radius);
drawText(x, y, "Text");

Parts of the drawing that fall outside the application window are clipped, so that a picture does not affect anything outside its window.

Device independent graphics systems extensively use attributes. These can be used to set colours and fonts, where it would be tedious to have to specify the data on every command.
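
For example (hypothetical attribute calls, in the same style as the primitives above):

setColour(Red);
setFont("Helvetica", 12);
drawText(x1, y1, "Text");       // both strings are drawn using the
drawText(x2, y2, "More text");  // current colour and font attributes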

Normalisation

In order to implement a World Coordinate system, we need a method of mapping World Coordinates to actual pixel coordinates. Doing this is fairly straightforward. First, use the graphics API of your choice to obtain the maximum and minimum pixel coordinates:

getWindowPixelCoordinates(DXMin, DXMax, DYMin, DYMax);

Then, translate your World Coordinates into pixel coordinates using the following transformation:

Xd = Xw * A + B
Yd = Yw * C + D

Where:

A = (DXMax - DXMin) / (WXMax - WXMin)
B = -WXMin * (DXMax - DXMin) / (WXMax - WXMin) + DXMin

With similar equations for C and D, merely using the appropriate maximum and minimum Y values.
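
As a rough illustration, here is the whole transformation as a small C function; the parameter names follow the equations above, and the example window sizes are invented:

// Map a world coordinate (xw, yw) into device (pixel) coordinates,
// following the equations above. wxMin..wxMax / wyMin..wyMax are the
// world window; dxMin..dxMax / dyMin..dyMax are the pixel window.
void worldToDevice(double xw, double yw,
                   double wxMin, double wxMax, double wyMin, double wyMax,
                   double dxMin, double dxMax, double dyMin, double dyMax,
                   int *xd, int *yd)
{
    double a = (dxMax - dxMin) / (wxMax - wxMin);
    double b = -wxMin * a + dxMin;
    double c = (dyMax - dyMin) / (wyMax - wyMin);
    double d = -wyMin * c + dyMin;
    *xd = (int)(xw * a + b);
    *yd = (int)(yw * c + d);
}

// For example, mapping the centre of a 10m x 10m world onto an
// 800x600 pixel window:
//     int xd, yd;
//     worldToDevice(5.0, 5.0, 0.0, 10.0, 0.0, 10.0, 0, 800, 0, 600, &xd, &yd);
//     // xd == 400, yd == 300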

Viewports

Some graphics APIs allow for further subdivision of the display window by providing viewports (subsections of the window).
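
In OpenGL, for example, glViewport(x, y, width, height) selects the rectangle of the window that subsequent drawing is mapped onto.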

Input Devices

The most important input device with respect to graphics, and the only device that this course really covers, is the mouse. This device records:

  • Distance moved in X Direction
  • Distance moved in Y Direction
  • Status of Buttons

The mouse causes a system interrupt every time it is moved or a button is pressed, and it is the responsibility of the application to track this movement and to display a pointer or marker.

Callback

The Operating System must share control of the mouse with the application, since it has to handle events outside the application window. It therefore traps all mouse events and then passes them on to the application. The application must, after every action, return to a callback procedure (or event loop) to determine if any mouse events have happened and, if so, to process them. A sample callback procedure may look something like:

while (executing) {
    if (mouseEvent()) processMouseEvent();
    if (windowResizeEvent()) redrawGraphics();
    // do stuff
}
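
In practice the toolkit usually runs this loop for you and calls back into your code. As a concrete (if era-appropriate) illustration, here is how mouse callbacks are registered in GLUT, a common OpenGL toolkit; the handler names (onMouse, onDisplay) are my own:

#include <GL/glut.h>

// Called by GLUT whenever a mouse button changes state;
// (x, y) is the pointer position in window pixel coordinates.
void onMouse(int button, int state, int x, int y)
{
    if (button == GLUT_LEFT_BUTTON && state == GLUT_DOWN) {
        // handle the click, e.g. translate (x, y) to world coordinates
    }
}

void onDisplay(void)
{
    glClear(GL_COLOR_BUFFER_BIT);
    glutSwapBuffers();
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);
    glutCreateWindow("callback demo");
    glutDisplayFunc(onDisplay);
    glutMouseFunc(onMouse);   // register the mouse callback
    glutMainLoop();           // GLUT runs the event loop for us
    return 0;
}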

Device Independent Input

Typically, a user will move a pointer on the screen and use it to click on a button. This identifies a pixel, and for device independent input, its address can be translated into the user’s world coordinate system.
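
This translation is just the inverse of the normalisation transformation from earlier; a sketch for the X coordinate (Y is analogous):

// Map a device (pixel) X coordinate back into world coordinates
// by solving xd = xw * A + B for xw.
double deviceToWorldX(int xd, double wxMin, double wxMax,
                      double dxMin, double dxMax)
{
    double a = (dxMax - dxMin) / (wxMax - wxMin);
    double b = -wxMin * a + dxMin;
    return (xd - b) / a;
}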

Two forms of visible marker are common. The first is called a locator, and is typically implemented as a cross-hair or pointer. The second is called a rubber band, and is indicated by a line or box stretching from a fixed point to the current mouse position. In most cases, this functionality is provided by the system software.

Facebook!

Saturday, October 6th, 2007

Andy Millar's Facebook profile

xkcd on relationships...

Saturday, October 6th, 2007

Group Project Assignment: Event-driven massively-multiplayer gaming platform

Friday, October 5th, 2007

Ok, so we’ve now been assigned our group projects. We have roughly 3 months to complete the project, with an estimated 8 weeks coding time.

The group is:

  • Me
  • Jack
  • Jee
  • Marten
  • Sean

The “brief” for the project so far is:

Nutshell: build a massively-multiplayer game, from scratch, using a pub/sub network as a foundation. Build a simulator to study its performance and scalability.

Background:

In a massively-multiplayer interactive game, thousands of players around the internet interact in a shared, virtual environment. For this experience to be truly “immersive”, interactions between players need to be rendered accurately within a few tens of milliseconds. The goal of this project is to explore a decentralised approach to achieving this, by building on top of a “publish-subscribe” network.

The pub/sub network infrastructure delivers events about the state of game agents to other agents on a “need-to-know” basis; an agent subscribes by specifying the class of events in which it is interested. Players only subscribe to events from agents whose behaviour is visible or might affect them.

The objective of this project is to build the experimental infrastructure to explore performance issues in such a game, and in particular the performance issues involved in the pub/sub routing infrastructure – in fact, we aim to produce a pub/sub benchmark program. Which is also hopefully a motivating, even entertaining, game.
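
To make the pub/sub idea concrete, here is a minimal sketch of what the subscription interface might look like; every name and type here is my own invention, not part of the brief:

#include <stdio.h>

// An event describing something a game agent did.
typedef struct {
    int    agentId;
    double x, y;      // where in the world the event happened
} GameEvent;

// A subscription names the class of events an agent cares about:
// the filter returns non-zero if the event should be delivered.
typedef int (*EventFilter)(const GameEvent *ev, void *subscriberState);

typedef struct {
    int         subscriberId;
    EventFilter filter;
    void       *state;
} Subscription;

static Subscription subs[64];
static int numSubs = 0;

void subscribe(int subscriberId, EventFilter filter, void *state)
{
    subs[numSubs].subscriberId = subscriberId;
    subs[numSubs].filter = filter;
    subs[numSubs].state = state;
    numSubs++;
}

// The broker routes the event to every matching subscriber.
void publish(const GameEvent *ev)
{
    for (int i = 0; i < numSubs; i++)
        if (subs[i].filter(ev, subs[i].state))
            printf("deliver event from agent %d to subscriber %d\n",
                   ev->agentId, subs[i].subscriberId);
}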

Breakdown:

This project has a number of elements:

  • Client game engine (actually it would be fine to use an existing engine here, preferably open source and cross-platform).
  • World/level generator. We need to be able to generate non-trivial virtual worlds algorithmically, so that we can automatically scale the size of the game and the number of players.
  • Virtual player AI. As well as human players we need robot players, both to make the game interesting and to be able to run automatic performance tests with large numbers of simulated players.
  • Publish/subscribe message routing infrastructure. We need a simple design that can be made very, very fast – but which is extensible to include game-specific event filters (such as level-of-detail [“he’s too far away to see or hit me”], or clipping [“he’s behind a wall”]); see the filter sketch after this list.
  • Network simulator. We need to evaluate the scheme using a large network, together with robot players and servers, to study the performance of the game – to show how message volume and interaction delay increase with the number of players (and world size, etc.). I suggest an event-driven simulation harness that coordinates execution of client, broker and server code running as separate processes.
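
As a rough sketch of the game-specific filters mentioned above (reusing the hypothetical GameEvent and EventFilter types from the earlier sketch, with a made-up view radius), a level-of-detail filter might look like:

// Level-of-detail filter: "he's too far away to see or hit me".
typedef struct { double x, y; } Position;

int withinViewRadius(const GameEvent *ev, void *subscriberState)
{
    const Position *me = (const Position *)subscriberState;
    double dx = ev->x - me->x;
    double dy = ev->y - me->y;
    return dx * dx + dy * dy < 50.0 * 50.0;  // hypothetical 50-unit radius
}

// Usage with the broker sketch above:
//     Position me = { 0.0, 0.0 };
//     subscribe(1, withinViewRadius, &me);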

This is a big project, so our objective is to simplify as much as possible – and to structure it so there are several partially-separable parts. It has the potential to be quite a lot of fun.