Pat Ambler wrote this in January 2009 after discussion with Bob Fisher. It still needs further work.
In the late 1960s the group working with Professor Donald Michie in The University of Edinburgh's Department of Machine Intelligence were interested in writing programs that tackled problems of interaction with the real world. After some initial experiments with simulation of the real world (A Sloman?) it was decided that it would be much easier to use the real world, building a robot equipped with cameras, than to try to simulate it. This led to the development of a series of 'Freddy' robots, culminating in the Freddy Mark 2 robot, now in the Royal Museum of Scotland in Edinburgh.
Freddy Mark 1. 1969-1971.
1. Barrow, H G & Salter, S H; Design of Low-cost Equipment for Cognitive Robot Research, Machine Intelligence 5, Edinburgh University Press, 1970.
2. Salter, S H; Arms and the Robot, Bionics Research Report No 9 (1973), Bionics Research Laboratory, School of Artificial Intelligence, University of Edinburgh.
Freddy Mark 1 was built in 1969 as an experimental device, designed to try out ideas before building a final machine. It rested on a small platform (3ft in diameter), which in turn rested on three steel balls. The device pushed its platform around by means of small wheels underneath it. It had two bumpers, with micro-switches, which allowed it to detect obstacles on the platform, and its main body was a TV camera mounted vertically just above a 45-degree mirror. It was connected to an ICL 4130 under the Multi-POP time-sharing system. It was used to develop a 'teachable' program capable of recognizing irregular objects. Jean Hayes (Donald Michie's partner at the time) referred to this version as 'the arthritic Lady of Shalott'.
Freddy Mark 1 was housed at
Freddy Mark 1.5. 1971-?
Reference: Barrow, H G & Crawford, G F; The Mark 1.5 Edinburgh Robot Facility, Machine Intelligence 7, Edinburgh University Press, 1972.
Freddy Mark 1.5 was built by Salter, Crawford and Barrow (with help from Ken Turner) and went on-line in May 1971 as a complete hand-eye system. It had a 2m square platform which moved bodily in two directions, like a giant x-y plotter, with a bridge over it. The hand was suspended from the bridge over the centre of the platform's movement. An oblique camera was mounted on the side of the bridge, and an overhead camera was mounted on top of the bridge, looking vertically down onto the platform. The 'hand' resembled a pair of hands and arms: there were two vertical, parallel plates, or 'palms', which could be driven towards each other to grip an object, and raised together to lift it. The whole arm assembly could be rotated about a vertical axis. The whole system was controlled by an 8K Honeywell 316 computer linked to a 128K time-shared ICL 4130 computer running POP2 programs.
Pat can't remember if Freddy 1.5 was ever at
Freddy Mark 2. 197?-1981.
Design of Freddy Mark 2: see Barrow and Crawford (1972) above.
Freddy Mark 1.5 was soon upgraded to Freddy Mark 2, with the addition of the ability to tilt the palms about a horizontal axis, and of force transducers in each wrist, so that the system could deduce the strength of its grip, the weight of objects held, and whether the hand had collided with an object. The ICL 4130 was replaced by a time-sharing DEC-10 computer administered by the Edinburgh Regional Computing Centre on behalf of the SERC. All programs using the facility were written in POP2.
Freddy was decommissioned in 1980 and acquired by the Royal Museum of Scotland (where he has been on show). He was replaced by a Unimation PUMA robot.
Systems using Freddy Mark 1.5 & 2
1. Parcel packing, object recognition and other tasks.
Michie, D, Ambler, A P, Barrow, H G, Burstall, R M, Popplestone, R J & Turner, K; Vision and Manipulation as a Programming Problem, Proc. 1st Conference on Industrial Robot Technology, 1973.
2. Versatile Assembly Program
Ambler A P, Barrow H G, Brown C M, Burstall R M, Popplestone R J. A versatile system for computer controlled assembly. Artificial Intelligence 6(2), 1975.
In this system, Freddy was taught to recognise simple wooden parts in each of their stable orientations. He did this by finding their outlines and learning them as descriptions of edges and corners (region finding followed by edge finding); a graph-matching system was used for the recognition. A box of parts would be thrown onto Freddy's table, and he would then move the table around looking for the outline of an object that he could recognise. He would pick it up, go back to the pile, and look for something else he could recognise. Failing that, he would push his hands through the pile to break it up a bit, and then look again for something to recognise. When everything had been found, he would assemble the car and the ship, using the touch sensors in his hands and a collection of high-level procedures for programming the arm movements and the use of sensors (such as 'fit peg in hole').
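As a rough illustration of the recognition step (this is not Freddy's actual code; the representation, thresholds and object models below are invented for the example), an outline can be stored as a cycle of corner angles and edge lengths, and a candidate outline matched against each learned model by trying every cyclic alignment:

```python
def outline_distance(model, candidate):
    """Best mismatch score over all cyclic alignments of two outlines.

    Each outline is a list of (corner_angle_deg, edge_length) pairs,
    taken in order around the object's boundary.
    """
    if len(model) != len(candidate):
        return float("inf")  # different corner counts: cannot match
    n = len(model)
    best = float("inf")
    for shift in range(n):  # try every cyclic starting corner
        score = 0.0
        for i in range(n):
            ma, ml = model[i]
            ca, cl = candidate[(i + shift) % n]
            score += abs(ma - ca) + abs(ml - cl)
        best = min(best, score)
    return best

def recognise(models, candidate, tolerance=10.0):
    """Return the name of the best-matching learned model, or None."""
    best_name, best_score = None, tolerance
    for name, model in models.items():
        score = outline_distance(model, candidate)
        if score < best_score:
            best_name, best_score = name, score
    return best_name
```

A noisy square outline, for instance, still matches a stored square model under the tolerance, while an outline resembling nothing learned returns None, which in Freddy's loop would trigger stirring the pile and looking again.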
Eric Lucey made a film (in 1973) of Freddy finding, sorting and assembling the parts of a toy wooden ship and car. This can be viewed online: http://groups.inf.ed.ac.uk/vision/ROBOTICS/FREDDY/freddyII.mpg. Harry Barrow did the commentary.
3. Robot programming language.
a) Ambler, A P & Popplestone, R J; Inferring the Positions Of Bodies From Specified Spatial Relationships, Artificial Intelligence, 6, 1975.
b) Popplestone, R J, Ambler, A P & Bellos, I; RAPT: A language for describing assemblies. The Industrial Robot 5, No 3, 1978.
c) Ambler, A P, Corner, D F & Popplestone, R J; Reasoning about the spatial relationships derived from a RAPT program for describing assembly by robot, Proc. IJCAI, 1983.
d) Cameron, S; A Rapt Picture-scrapbook, Dept. Artificial Intelligence, Edinburgh, Working Paper 106, 1982.
e) Yin, B, Ambler, A P, Popplestone, R J; Combining Vision Verification with a High level Robot Programming Language. Department of Artificial Intelligence Research Paper 247, (198?).
These papers describe the development of an 'object level' robot programming language, in which the desired positions of the palms of the robot hand, and of the objects to be manoeuvred and assembled, could be described in terms of spatial relationships, such as 'put the left palm against the left side of the block', 'align the hole in the block with the axis of the peg', 'move the block vertically'. The computational system then decided what this meant in terms of the actual positions of the various objects, including the robot hands. The first version of the system (described in ref a) used an algebraic method for describing these spatial relationships, and an equation-solving system for determining the absolute positions. The second version (described in ref c) used a geometric system of description, and constraint reasoning for determining the required positions. The objects involved in the assembly process were all described using a solid modelling system (ref d). A later version of the RAPT language included the use of vision and potentially touch sensing (ref e).
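The core idea can be sketched in a toy 2D example (this is not the RAPT system; the relations, poses and numbers below are invented for illustration): a body's pose is (x, y, theta), each spatial relation removes degrees of freedom from the pose, and composing enough relations pins the pose down completely.

```python
def against_table(pose, table_y=0.0):
    """'Put the face of the block against the table': fixes the tilt
    (theta = 0) and the height (y = table_y), leaving x free."""
    x, _y, _theta = pose
    return (x, table_y, 0.0)

def align_with_peg(pose, peg_x=5.0):
    """'Align the hole in the block with the axis of the peg': fixes
    x = peg_x, leaving the other coordinates untouched."""
    _x, y, theta = pose
    return (peg_x, y, theta)

def solve(pose, relations):
    """Apply each relation in turn; with enough relations, the
    initially arbitrary pose becomes fully determined."""
    for relation in relations:
        pose = relation(pose)
    return pose
```

Starting from any initial guess, solve((1.0, 3.0, 0.7), [against_table, align_with_peg]) yields (5.0, 0.0, 0.0): the single pose satisfying both relations. RAPT did the real, 3D version of this, first by equation solving over algebraic descriptions and later by constraint reasoning over geometric ones.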
Eric Lucey made a film in 1980 of Freddy carrying out a benchmark assembly task under the control of the RAPT system. The commentary was provided by Robert Rae in his good Scottish accent. (Pat hopes she has a copy of this (as a video) somewhere, but she has not located it yet.)
4. Use of structured light
Popplestone, R J, Brown, C M, Ambler, A P & Crawford, G F; Forming Models of Plane and Cylinder Faceted Bodies from Light Stripes, Proc. 4th IJCAI, Tbilisi, 1975.
In this system objects on Freddy's table were moved under a stripe of light in order to obtain information about the planes and cylinders from which they were formed.
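The principle can be sketched as follows (the geometry and tolerances here are illustrative assumptions, not details taken from the 1975 paper): a plane of light tilted at a known angle is projected onto the scene, so the sideways shift of the stripe in an overhead view gives the height of the surface it falls on, and a straight stripe segment suggests a planar facet while a bent or curved one suggests a cylindrical facet.

```python
import math

def stripe_height(image_offset, projector_angle_deg):
    """Height of the surface under the stripe. With the projector
    tilted at angle a from the vertical, a surface of height h shifts
    the stripe sideways by h * tan(a), so h = offset / tan(a)."""
    return image_offset / math.tan(math.radians(projector_angle_deg))

def looks_planar(points, tol=0.05):
    """True if the stripe points lie close to the straight line through
    the first and last point: a crude plane-vs-cylinder test."""
    (x0, y0), (x1, y1) = points[0], points[-1]
    dx, dy = x1 - x0, y1 - y0
    length = math.hypot(dx, dy)
    # Maximum perpendicular distance of any stripe point from the line.
    worst = max(abs(dy * (x - x0) - dx * (y - y0)) / length
                for x, y in points)
    return worst <= tol
```

For example, with the projector at 45 degrees, a 10mm sideways shift of the stripe corresponds to a 10mm-high surface; sweeping the stripe over an object and collecting such measurements builds up its planar and cylindrical facets.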
Odds and ends.
People involved in the development and use of Freddy included (as Pat remembers): Donald Michie, Robin Popplestone, Steve Salter, Harry Barrow, Gregan Crawford, Rod Burstall, Ken Turner, Chris Brown, Bob Beattie, Steve Cameron, Yin Baolin, Anastasia Koutsou, Ilona Bellos, Roy Featherstone.