Friday 25 November 2011

Invited Talk in Tampere


Early in November, I was invited to give one of the keynote talks at the International Symposium on System-on-Chip, which is held annually in Tampere, Finland. My talk was about a fast and accurate simulation model that I have developed for a specific Network-on-Chip architecture. By choosing that architecture, I could exploit its time predictability and create a very abstract model of the on-chip interconnect without too much accuracy loss. The results I presented show a simulation speed-up of more than 1000x, with less than 10% accuracy loss. Existing work has achieved only a fraction of that speed-up. A paper with the first details about this simulation model was published in the DATE 2011 proceedings and is available in IEEE Xplore.
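For readers curious about what "exploiting time predictability" buys you, here is a toy sketch in Java. It is deliberately much simpler than, and unrelated to, the model in the DATE 2011 paper, and every parameter in it is made up: the point is only that when the interconnect gives analytical latency bounds, a simulator can charge each packet a single computed delay instead of simulating the network cycle by cycle.

    // Toy illustration (NOT the model from the DATE 2011 paper): in a time-predictable,
    // TDM-style NoC, the end-to-end latency of a packet can be bounded analytically,
    // so an abstract simulator can add one computed delay per transfer rather than
    // simulate the interconnect cycle by cycle. All parameters below are invented.
    public class AbstractNocLatency {
        // Worst-case latency of one packet, in cycles:
        // wait for the sender's TDM slot, then one routing delay per hop,
        // then stream the flits through the pipeline (one flit per cycle).
        static long packetLatency(int hops, int flits, int slotLength, int slots, int hopDelay) {
            long slotWait = (long) slotLength * (slots - 1);   // worst case: we just missed our slot
            return slotWait + (long) hops * hopDelay + flits;
        }

        public static void main(String[] args) {
            // e.g. 4 hops, 8-flit packet, 4 slots of 8 cycles each, 3-cycle router delay
            System.out.println(packetLatency(4, 8, 8, 4, 3) + " cycles worst case");
        }
    }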


Besides the invited talk, I also gave a tutorial on using UML and its extensions to model and validate multiprocessor embedded systems. In that tutorial, I covered some of the topics I teach in the System Specification part of York's EDI module, as well as some of the results of the research I am doing within the EU-funded MADES project.


I'd like to thank Jari Nurmi and Sanna Määttä for the invitation and for hosting me in Tampere again. Sanna, who worked for a year in my research group when I was in Germany and has recently finished her PhD under Jari's supervision, appears in the photo below introducing my talk.





Tuesday 19 July 2011

ReCoSoC 2011


Last June, I participated in the International Workshop on Reconfigurable and Communication-centric Systems-on-Chip (ReCoSoC), which took place in Montpellier, France. This was the sixth in a series of workshops that I helped to start back in 2005, together with Gilles Sassatelli and Manfred Glesner. It grew out of a cooperation project between TU Darmstadt (my former university) and the LIRMM laboratory in Montpellier, France. The main goal was to have a forum for academics doing research in Systems-on-Chip that are reconfigurable and/or communication-centric.


Let me explain those terms one by one. Systems-on-Chip (usually known as SoCs) are a specific implementation style for embedded systems. Like all embedded systems, they provide specific functionality (usually related to a particular application domain, such as robotics or telecommunications), but SoCs integrate all of that functionality on a single silicon die -- the chip. SoCs can integrate one or several processor cores, different types of memory, hardware accelerators (e.g. video codecs, cryptography engines), radio-frequency controllers and sensors (e.g. accelerometers), among other components, all on the same chip.


As SoCs grow more complex, the on-chip interconnect structure becomes more and more important. After all, the multiple cores and components listed above must communicate with each other and with the external world. The “communication-centric” part of ReCoSoC deals with the challenges of designing and evaluating different types of on-chip interconnects. A recent development in that area is the concept of Networks-on-Chip, which I will cover in another blog post.


Even though SoCs can integrate plenty of functionality, there is always a cost associated with it. Therefore, it may not be efficient to have a particular function available during the whole lifetime of the system, especially if it is used only in specific scenarios (e.g. video codecs are only needed when recording or displaying video). On the other hand, some functions may need to be replicated during the lifetime of the system because the demand is too high to be satisfied by the initial resources. The concept of changing the SoC’s logical structure after it has been deployed is called reconfiguration, and this is also a major topic in ReCoSoC.


This year’s ReCoSoC was the most successful so far, with more than 80 research paper submissions. A committee of 42 researchers evaluated all submissions (each submission was reviewed by 3-4 committee members) and selected 29 of them to be presented as full papers at the workshop. Another 21 were considered work in progress and were selected to be presented as posters. The workshop had more than 80 attendees, and included a number of invited talks besides the accepted papers and posters. My favourite talks were:



  • the keynote on Embedded Compilation and Virtualization by Christian Bertin (from STMicroelectronics);
  • a paper on Asymmetric Cache Coherency by John Shield, Jean-Philippe Diguet and Guy Gogniat (all from Uni Bretagne Sud, FR);
  • a paper presented by my former PhD student Luciano Ost (now working at LIRMM), providing an overview of our design flow for NoC-based embedded systems and highlighting some of his latest contributions on dynamic task mapping;
  • two papers describing optimisations for NoCs with time-division multiplexing (TDM) presented by Daniel Vergeylen and Angelo Kuti Lusala (Uni Cath Louvain, BE);
  • a very interesting paper about the possibility of transmitting data in FPGAs using temperature variations (which has interesting implications for security and privacy). It was presented by Taras Iakymchuk and Krzysztof Kepa (from Nokia Siemens, PL);
  • a poster on self-organising processing elements used to control mobile robots, presented by Laurent Rodriguez (ETIS-ENSEA, FR).



Another interesting point about this year's ReCoSoC is that it took place at the Faculté de Médecine in Montpellier. It is one of the oldest medical schools in the world, established in 1180 (Nostradamus studied there in 1529!). We were all amazed to see presentations about modern multicore embedded systems in a lecture theatre built centuries ago to teach anatomy… it still has an ancient dissection table with an oval marble top! (See the video below, which I made during one of the talks.)






In June 2012, we will organise ReCoSoC here in York. We hope to keep up with the excellent quality of the technical contributions, and to have again a history-rich environment for the discussions, this time provided by the ancient city of York. I’ll post more information here as it becomes available.

Thursday 16 June 2011

Competition photos

Here are some photos I took during one of the problem-based sessions I mentioned in the previous post. This was a competition between two groups of students, and the goal was to write a communication protocol that would enable motes to transmit a message across the Computer Science building, from the CrossRail lab to my office. What the previous post did not emphasise is that the students had to build such a protocol from scratch, using only the PHY-layer primitives provided by Mote Runner. They had only two hours to analyse the problem, code the protocol, test it, download the code to the motes, decide on their placement and finally deploy them. Not easy. Both groups eventually achieved the goal, but one of them did it first and won a box of chocolates.






Protocol programming and testing




Mote programming





Mote deployment



Winning team

Friday 20 May 2011

Wireless Sensor Networks

The academic year 2010-2011 saw major improvements to the Embedded Systems module (see previous post). One of those improvements aimed to reflect the increasing importance and widespread use of wireless sensor networks (WSNs). Such networks are a specific type of distributed embedded system, and have been applied in domains such as habitat monitoring, earthquake prediction and smart homes. WSN nodes (also known as motes) are thumb-sized computers that operate on batteries and sense the environment around them. Motes cost between £50 and £250 each (depending on their wireless radio standard, processing/storage capabilities and packaging), plus additional costs for specific sensors (temperature, light, sound, etc.).

To provide students with practical experience on the development of such systems, I submitted a request to the university's Rapid Response Fund to finance the purchase of a set of wireless sensor nodes (with matching funds from the CS department). I was really happy to see that both the university and the department have such mechanisms to enable small but high-impact initiatives that can improve student experience. The process was very lightweight, with no bureaucracy, and in a few weeks I got the notification that the request had been granted.

I decided to purchase an IRIS Classroom Kit, which includes 30 motes, 20 light/temperature sensors and 10 USB data acquisition boards (for programming and debugging on lab PCs), plus the respective software and documentation. After the delivery of the kit, I tested the motes and started to prepare my laboratory activities and experiments. To simplify the programming of the motes by the students, I chose Mote Runner, a new software package that had recently been released by IBM through their alphaWorks programme. In an agreement with IBM's research centre in Zurich, York became one of the first universities worldwide allowed to use that software package for teaching and research purposes (alongside ETH Zurich). I will probably write more about Mote Runner in another post, but here's some information coming directly from the source:



I've prepared six 2-hour lab sessions covering different aspects of WSN design, including system specification, embedded software programming, networking protocols, energy efficiency, system simulation and deployment. Some of the lab sessions included simple exercises to familiarise students with the hardware and software infrastructure, while others followed a problem-based learning approach.

In one of the problem-based sessions, students were divided into two groups, each given 10 motes, and were asked to create a simple packet forwarding protocol to transmit a message across the Computer Science building, from the CrossRail lab (where the lab sessions took place) to my office. Once both groups had implemented the protocol and deployed the motes, they would call my office from the CrossRail lab and ask for the message to be transmitted (actually a sequence of colours to be displayed by blinking LEDs on a mote). The message would then be loaded onto a base station mote, which in turn would forward it to the other nodes, hop by hop, towards the destination. The first group to complete the mission was awarded a box of chocolates.
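To give a flavour of what the students had to build, here is a minimal sketch of the hop-by-hop relay idea in plain Java. The radio neighbourhood is simulated with ordinary method calls, and none of the names below come from the actual Mote Runner API; the sketch only illustrates the forwarding and duplicate-suppression logic that a protocol like this needs.

    // Illustrative sketch only -- not Mote Runner code and not the students' solutions.
    // Each mote re-broadcasts a message it has not seen before, so the packet travels
    // node by node from the base station towards the destination.
    import java.util.ArrayList;
    import java.util.List;

    class RelayMote {
        private final int id;
        private int lastSeenSeq = -1;                               // duplicate suppression
        private final List<RelayMote> inRange = new ArrayList<>();  // stand-in for the radio neighbourhood

        RelayMote(int id) { this.id = id; }

        void addNeighbour(RelayMote m) { inRange.add(m); }

        // Called when a packet arrives "over the radio" (here: a plain method call).
        void onReceive(int seq, String payload) {
            if (seq <= lastSeenSeq) return;                         // already relayed, drop it
            lastSeenSeq = seq;
            System.out.println("mote " + id + " relays seq=" + seq + " payload=" + payload);
            for (RelayMote m : inRange) m.onReceive(seq, payload);  // re-broadcast to neighbours
        }
    }

    public class RelayDemo {
        public static void main(String[] args) {
            // Build a simple chain: base station (0) -> mote 1 -> ... -> mote 4 (the office)
            RelayMote[] chain = new RelayMote[5];
            for (int i = 0; i < chain.length; i++) chain[i] = new RelayMote(i);
            for (int i = 0; i + 1 < chain.length; i++) {
                chain[i].addNeighbour(chain[i + 1]);
                chain[i + 1].addNeighbour(chain[i]);                // radio links are bidirectional
            }
            chain[0].onReceive(1, "RED,GREEN,BLUE");                // inject the colour sequence at the base station
        }
    }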


Part of the assessment of the students (their exam!) was also based on Mote Runner and the IRIS motes. This time, the problem involved indoor localisation of a mobile mote using the radio signal strength measured from six stationary motes. Students were evaluated on the accuracy of the localisation algorithms they implemented, as well as the memory footprint and energy efficiency of their solutions. They actually did well!
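For the curious, here is a rough sketch of one standard way to attack that kind of problem (again not Mote Runner code and not the students' solutions): convert each RSSI reading into a distance estimate using a log-distance path-loss model, then search for the position that best fits the distances to the six fixed motes. All coordinates, readings and calibration values below are invented for illustration.

    // Illustrative sketch of RSSI-based localisation. Anchor positions, RSSI values,
    // path-loss exponent and reference power are all made up.
    public class RssiLocaliser {
        // Log-distance path-loss model: rssi = txPowerAt1m - 10 * n * log10(d), solved for d.
        // txPowerAt1m and n would normally be calibrated in the lab.
        static double rssiToDistance(double rssi, double txPowerAt1m, double n) {
            return Math.pow(10.0, (txPowerAt1m - rssi) / (10.0 * n));
        }

        public static void main(String[] args) {
            double[][] anchors = { {0,0}, {10,0}, {20,0}, {0,8}, {10,8}, {20,8} };  // fixed motes (metres)
            double[] rssi = { -58, -66, -74, -61, -55, -70 };                       // example readings (dBm)
            double txPowerAt1m = -45.0, n = 2.5;                                    // assumed calibration

            double[] dist = new double[rssi.length];
            for (int i = 0; i < rssi.length; i++) dist[i] = rssiToDistance(rssi[i], txPowerAt1m, n);

            // Brute-force grid search over the room: trivial on a lab PC, far too heavy
            // for a mote, which is exactly where memory footprint and energy come in.
            double bestX = 0, bestY = 0, bestErr = Double.MAX_VALUE;
            for (double x = 0; x <= 20; x += 0.1) {
                for (double y = 0; y <= 8; y += 0.1) {
                    double err = 0;
                    for (int i = 0; i < anchors.length; i++) {
                        double d = Math.hypot(x - anchors[i][0], y - anchors[i][1]);
                        err += (d - dist[i]) * (d - dist[i]);       // squared residual against the RSSI estimate
                    }
                    if (err < bestErr) { bestErr = err; bestX = x; bestY = y; }
                }
            }
            System.out.printf("estimated position: (%.1f, %.1f)%n", bestX, bestY);
        }
    }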


At the moment I am thinking about some new problems that I can use to assess the students taking the Embedded Systems module in the next academic year. But I can't give you any details now... my students would be really sad if I ruined the surprise :-)

Monday 9 May 2011

Embedded Systems Design and Implementation

In line with the profile of the department (as discussed in my previous post), we regularly improve the curriculum so that we can graduate professionals who are proficient in both the software and hardware parts of a computing system. A specific type of computer is the so-called embedded system: a computer that has a specific functionality and must perform it very efficiently, often as part of a larger system or network. Examples of embedded systems include a mobile phone, the motor controller of an industrial robot, a car's anti-lock braking system or the digital audio effects rack used by recording studios. It is estimated that 98% of all computers are actually embedded systems (yes, that's right, all PCs, laptops and servers account for only 2% of all computers!)

For many years, our Computer Science degree programme has included a module called Embedded Systems, which covers basic and advanced topics in that area. It ensures that our graduates learn to fine-tune a computer system so that it can fulfil the requirements of specific applications and functionality. Recognising the importance of that area and its wide range of applications, we have expanded that module, which is now called Embedded Systems Design and Implementation. It focuses on the complete engineering flow: specification, design, implementation and validation. Additional content and new lab sessions were added, so that students have a chance to learn and practise the following skills:


  • specification of functionality at system level (unified hardware/software view)
  • different approaches to implement functionality in hardware or in software 
  • embedded software implementation 
  • wireless communication design
  • on-chip communication design
  • custom hardware design using FPGAs

I'll post more details about some of those topics soon.

Friday 6 May 2011

Hardware vs. Software

Computer Science students here at York know that they must be proficient in both hardware and software. From the moment they arrive to the time they hand in their graduation projects, they are exposed to a healthy mix of lectures and labs that cover both domains as well as their mutual dependencies. This is a consequence of the department's commitment to research in embedded, real-time and safety critical computer systems, which require careful co-design of the hardware and software subsystems. 


I was pleased to find out that prospective students are also aware of that. A few weeks back, I was interviewing a candidate for one of our undergraduate degrees, and when I asked why he had applied to study at York he answered "because York Computer Science students do a lot of hardware stuff". I was curious about that answer, so he told me that very few Computer Science departments have so many hardware labs and advertising flyers with photos of students playing with robots, circuit boards and oscilloscopes.


That got me thinking about how the perception of a software vs. hardware dichotomy could influence someone's choice of a higher education institution or degree programme. But considering that usability, performance and energy consumption are the key issues in computing these days, I wonder who would benefit from concentrating only on hardware or only on software, when it is their interplay that determines the success and wide adoption of a computing system. Will someone use an iPad if it is slow, runs out of battery in 20 minutes and has a poor user interface? Will they use it if only two out of the three issues are solved? No, you need a design that solves all three problems, and you cannot do that with software or hardware alone. Therefore I think we should be happy to be known as the Computer Science department where students also learn a lot of "hardware stuff".