Launch!

We finally launched our probe on a lovely sunny Saturday! We were incredibly lucky in many ways: we almost ran out of fuel in our chase car; we needed three times the originally calculated amount of helium after the flight predictor suggested we might drift into Heathrow airport, and only had enough thanks to a delivery error; and it was one of the only sunny days in a week of thunderstorms!

After a successful but painful-on-the-hands launch, the probe lost radio contact midway through the flight, but luckily the HabHub network of listener stations could track our payload for us as we chased it across the country. We had the last-known location on our satnav as we travelled towards Hockley. As we drew closer, we regained radio contact with our portable Yagi antenna! We pinpointed the latitude and longitude and set off on foot. We were worried it might have landed in the road, but Wei spotted our plucky Sentinel's bright orange colouring through a hedge, in the middle of a field.

We retrieved it and were amazed to find it perfectly intact, with hardly a scratch and all the electronics in working order.

Here is our final launch video that we showed during our final project presentation. We hope you enjoy it as much as we did!

FIFTH WEEK PROGRESS UPDATE

Hello all! We are nearing the big event, the launch of our probe, which is scheduled for liftoff early next week! I have already run some HabHub landing-predictor models, which show the probe coming down anywhere from Ipswich to Milton Keynes depending on the time of day, so we will have to be very attentive with our timing.

HabHub Flight Predictor trial

Map of launch site at Churchill Cambridge (Chart is courtesy of Ordnance Survey © Crown copyright 2008)

I am still in contact with the BOC Group to finalise the dimensions of the regulator; however, we have decided to go with the 528-G10 Helium Balloon Gas (safety datasheet: http://www.boconline.co.uk/internet.lg.lg.gbr/en/images/tg-8312-helium-v1.4410_39593.pdf?_ga=1.133572108.306201186.1464085511).

I have spoken to the Cambridge University Space Flight society, who have confirmed that they will provide duct tape. We need to bring 1mm nylon cord (they suggested http://randomaerospace.com/Random_Aerospace/Stuff.html); however, we received some nylon string in our unboxing, so we don't need to purchase any more. Finally, the team mentioned that we don't need to bring gloves.

Diyar, Max and I have been working on the physical construction of the probe, which involves a lot of sanding and fine-tuning of the polystyrene! We have successfully fitted one of our insulation discs onto the largest hemisphere and are considering adding an additional disc for extra thermal protection. We have also fitted the GoPro onto the top hemisphere, although we may need to redo this as the polystyrene is slightly damaged in places.

Top hemisphere of probe housing GoPro camera

Largest hemisphere of probe with the first insulating layer fitted

We have continued to add content to the website on two fronts. Firstly, Diyar, Wei, Arshan and Max recorded and uploaded our first podcast to the 'Feed' page. The podcast details the purpose of our overall mission, gives a technical overview of how our probe functions, and covers how to contact the probe when it is in space. Secondly, Wei uploaded his second tutorial, which covers the build of a LoRa gateway. The LoRa gateway is primarily used to receive and decode the messages sent by the tracker, but Wei added extra functionality that allows us to upload messages to the LoRa tracker as well (i.e. sending tweets to the OLED display).

Max has been working on the animation for the post-flight replay using Blender and Unity3D. Shawn modelled the balloon, parachute and payload in Blender, and Max imported the models into Unity to animate, having heard of the hardships of F-curves from last year's group. So far they have the basic functionality up and running: tweets can be displayed, the balloon moves and the weather animation plays, all timed according to the timestamps of our sensor data and tweet bank as read from a file. We are hoping to add an interactive timeline and image-display functionality today.

Probe animation with chilly environment demonstrated via snow graphic

Wei ran into some issues with the previous OLED, in that it glitched whenever it ran alongside the tracker software. At the start of the week Wei managed to modify the tracker software to update the OLED display after every picture taken, but the OLED ended up malfunctioning because the tracker uses the same serial interface as the OLED. We have therefore ordered a new one (which uses a different interface) and Wei has been tinkering with it. We are happy to report that it does work together with the tracker software; now Wei just needs to modify his code to accommodate the new OLED.

Ash, Max and Wei have also been working to figure out how to get the Raspberry Pi camera to focus on the text on the small OLED screen and have ordered a variety of lenses to experiment with.

- Peter

FOURTH WEEK PROGRESS UPDATE

Hello all! We are now officially in June and, with less than two weeks to our provisional launch date, we are beginning to finalise our design and construction.

I spoke with the Department Safety Advisor, Andrew Paice, and he informed me that transporting the helium ourselves would be too difficult from a safety standpoint, at least in our short time frame. With that in mind, I have confirmed with my contact at the BOC Group that we can purchase high-quality helium and have it transported to and from our launch site by BOC, thus taking the weight off our shoulders. However, the valves of the available regulators are far too small for our balloon diameter (7.8 cm, accurate to ±0.4 cm), so we will need to use a hose to channel the gas. I have also emailed our contact at Churchill Cambridge about what materials (if any) they will provide for our launch (i.e. gloves, nylon string, duct tape etc.) and am waiting to hear back.

I have been working with Diyar and Shawn on creating live graphs to display sensor data on our website using the JavaScript D3.js library, although Diyar and I have put that work on hold for the time being in favour of other, more essential features. Shawn, however, has continued working on animations that convey our sensor information in a way that is engaging for the user. He has been working with the animation program Blender, which uses mathematical tools such as keyframes and F-curves to create animation deformations and so forth. Our hope is that these graphics can, for instance, show a man shivering, perhaps with snow falling in the background, when the temperature surrounding our probe drops below a certain threshold. We believe this would be a great and unique way to represent our sensor data in a more engaging manner.

Static Blender model of helium-filled balloon on the ground

In other news, Wei and Max have uploaded the first two of many tutorials to the website! Wei's tutorial is an in-depth walkthrough of the tracker, covering the inventory of key components and add-ons (LoRa, sensors etc.) as well as their installation on the Jessie Lite and Raspbian Lite operating systems. Wei will upload another tutorial about the OLED setup very soon, once he has got it working reliably. Max's extensive tutorial covers the troublesome topic of decoding the radio transmissions: it goes into depth on how to use the FUNcube Dongle Pro+ together with the SDR# and dl-fldigi programs to receive, display and decode the radio signals from the PITS board. Max will be posting more tutorials in the near future, with the first covering the uplink/downlink. Yihan has also sent me his GPS data-mapping tutorial, which will go up very shortly!

Regarding the OLED setup, Yihan found a library for the display on the Adafruit website. Wei has been working with this library, but because it was designed for an older model, and because the available documentation is rather poor, there have been numerous issues with getting the text properly aligned on the screen. It is, however, very close to being sorted. He has also been writing custom Python scripts to interface with the screen, as well as modifying the PITS software for the camera. The idea is that the screen should change the tweet displayed after each picture is taken. We are confident this can be done by the middle of next week at the latest.
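
For anyone wrestling with similar alignment issues, the approach boils down to rendering the text into a PIL image first (where it can be measured and centred) and then pushing the whole frame to the display. Below is a minimal sketch; the 128x64 I2C model and the reset pin are assumptions rather than our confirmed part.

```python
# Minimal text-centring sketch using the Adafruit SSD1306 Python library.
# The 128x64 I2C variant and reset pin below are assumptions, not our exact part.
import Adafruit_SSD1306
from PIL import Image, ImageDraw, ImageFont

disp = Adafruit_SSD1306.SSD1306_128_64(rst=24)  # reset pin assumed
disp.begin()
disp.clear()
disp.display()

# Draw into a 1-bit PIL image sized to the display, then push the whole frame.
image = Image.new('1', (disp.width, disp.height))
draw = ImageDraw.Draw(image)
font = ImageFont.load_default()

text = "Hello from the edge!"
w, h = draw.textsize(text, font=font)
draw.text(((disp.width - w) // 2, (disp.height - h) // 2), text, font=font, fill=255)

disp.image(image)
disp.display()
```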

- Peter

THIRD WEEK PROGRESS UPDATE

Hello all! This week we are making significant progress towards the successful launch of our probe as we quickly near June!

Firstly, I have been speaking with Dr Tate (Lab Facilities Manager), who has informed me that he is in possession of a 50 bar, size V cylinder of CP (constant pressure) helium, as well as a fully functioning regulator. A size V cylinder holds 1.8 m³, which at 50 bar corresponds to approximately 90 m³ at one atmosphere, the standard pressure (and is sufficient for our needs). I have also just confirmed with Mr Luke, the Department Safety Advisor, that this cylinder of helium is below the threshold for full compliance with the European Agreement concerning the International Carriage of Dangerous Goods by Road (ADR).

The ADR calculation specifically takes into account the water capacity of the cylinder (water capacity in litres × number of cylinders = number of points), with the full regulations kicking in at a threshold of 1000 points, which we are thankfully nowhere near. The final step in acquiring the helium is confirming that we will use a transport vehicle insured for occasional business use.
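
To make the points arithmetic concrete, here is a quick sketch; the 50-litre water capacity is purely illustrative (the real figure is stamped on the cylinder):

```python
# ADR points check: water capacity (litres) x number of cylinders.
# The 50 L water capacity below is illustrative, not our cylinder's real figure.
ADR_THRESHOLD = 1000  # full ADR regulations kick in at this many points

def adr_points(water_capacity_litres, num_cylinders):
    return water_capacity_litres * num_cylinders

points = adr_points(50, 1)
print(points, "points; full ADR applies:", points >= ADR_THRESHOLD)  # 50 points; False
```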

I have also been working on the website's content in three regards. Firstly, I have added information about the polystyrene insulation to the 'Sentinel' page. As I mention there, the stratosphere's temperature distribution is unique due to the presence of the ozone layer at approximately 30 km, so proper insulation is essential to protect our equipment from temperatures below −40 °C. Further to this, Diyar has been working with AutoCAD to build some sample designs for our probe housing. Secondly, I am experimenting with incorporating the JavaScript D3.js library into the HTML code, to try to create live, interactive graphs of our sensor data on the website. If this is not possible directly, then we can host the library on a different server and embed the link on our main site. Finally, I have added a 'Tutorials' page to the website, on which Wei and Max will post in-depth technical walkthroughs as early as next week!

In other news, Yihan made a video displaying a travel path on Google Maps. From data fetching to information analysis and conversion, KML demonstrations and travel display, he has now finalised all research regarding the manipulation of GPS data and KML. In addition, Yihan contacted Adafruit (an electronics manufacturer) about the availability of large monochrome organic light-emitting diode (OLED) displays. We use OLEDs because they can work in certain quasi-space conditions, such as low temperature and pressure. Since the size of OLED that we wanted is not currently available, Yihan has enquired whether the manufacturer can source one and deliver it to us in a reasonable time. Moreover, Yihan has assumed the responsibility of bookkeeper, managing our budget account. He has made an Excel spreadsheet which indicates that, as of now, we have £202.84 of available budget.

Max, Ash and Wei have been working on the uplink, and they are very happy to report that it is working! Both the balloon Pi and the ground Pi are operationally ready (save for the OLED display). We are able to periodically pull GPS coordinates and sensor data off the balloon, in addition to pictures taken with the Raspberry Pi camera. At the same time, we are able to tweet messages at the balloon using the keyword "MQTT". We have set up Node-RED in Bluemix so that it searches for the keyword and pushes the message to the ground Pi. The ground Pi then sends the data up to the balloon Pi via LoRa, and the message is displayed on our console. The next step is to display the received messages on the OLED display; the team will crack on with this once the OLED display arrives!
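
For those interested in the plumbing, the ground-Pi side of the uplink boils down to something like the sketch below: subscribe to the topic that Node-RED publishes to, then hand each message to the LoRa radio. The broker address, topic name and the send_via_lora() stub are hypothetical placeholders rather than our actual code.

```python
# Sketch of the ground-Pi uplink handler: subscribe to the Node-RED feed and
# forward each tweet to the balloon over LoRa. Broker address, topic name and
# the send_via_lora() stub are hypothetical placeholders.
import paho.mqtt.client as mqtt

BROKER = "quickstart.messaging.internetofthings.ibmcloud.com"  # assumed Bluemix host
TOPIC = "projectedge/uplink"  # hypothetical topic that Node-RED publishes to

def send_via_lora(payload):
    # Placeholder: in the real system this hands the bytes to the LoRa radio.
    print("LoRa uplink:", payload.decode(errors="replace"))

def on_connect(client, userdata, flags, rc):
    client.subscribe(TOPIC)

def on_message(client, userdata, msg):
    send_via_lora(msg.payload)

client = mqtt.Client()
client.on_connect = on_connect
client.on_message = on_message
client.connect(BROKER, 1883, 60)
client.loop_forever()
```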

- Peter

Information received through LoRa and sent to Twitter (displayed below in second picture) and vice versa (UPLINK message)

Display of ability to tweet pictures, embedded maps, coordinates and sensor data through use of LoRa

SECOND WEEK PROGRESS UPDATE

Hello all! We are reaching the end of our second week and strong progress has been made! Firstly, Max and Arshan have managed to parse the GPS coordinates from the decoder software's output text file and send them to IBM's Bluemix IoT node via a Python MQTT script. Once on Bluemix, the messages are routed to Twitter, and Arshan and Max are now experimenting with routing messages conditionally, based on incoming tweets from people following the project, using custom JavaScript functions within Node-RED.
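
In outline, the Python MQTT script does something like the following sketch. The $$callsign telemetry layout, field positions, file name and broker address are all assumptions for illustration rather than the exact format we use.

```python
# Sketch of the parse-and-publish step: pull lat/lon/alt out of the decoder's
# output file and publish them over MQTT. The $$callsign sentence layout and
# field positions are assumptions; adjust to the real PITS telemetry format.
import json
import paho.mqtt.publish as publish

def parse_telemetry(line):
    if not line.startswith("$$"):
        return None
    fields = line.strip().lstrip("$").split("*")[0].split(",")
    # Assumed layout: callsign, sentence id, time, latitude, longitude, altitude
    return {"lat": float(fields[3]), "lon": float(fields[4]), "alt": float(fields[5])}

with open("decoded_telemetry.txt") as f:  # hypothetical decoder output file
    for line in f:
        fix = parse_telemetry(line)
        if fix:
            publish.single("projectedge/telemetry", json.dumps(fix),
                           hostname="localhost")  # broker address is a placeholder
```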

Yihan has been working on mapping the GPS data received from the Pi in the Sky (PITS) board and manipulating it in other software so that the location information can be presented on screen in a more accessible and intuitive manner. He has successfully converted the raw data (a plain text file) into more commonly used formats, namely GPX and Keyhole Markup Language (KML). With these GPS files we can map and display geographic information in map-viewer software such as Google Earth. He is now working on connecting the individual location points into a travel path, in order to track our flight and make a video on Google Earth, giving us a comprehensive view of our journey of discovery.
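
To give a flavour of the conversion, here is a hand-rolled sketch that turns a list of (latitude, longitude, altitude) fixes into a minimal KML LineString that Google Earth can display; the sample coordinates are illustrative.

```python
# Minimal KML writer: turn a list of (lat, lon, alt) fixes into a LineString.
KML_TEMPLATE = """<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Document>
    <Placemark>
      <name>Flight path</name>
      <LineString>
        <altitudeMode>absolute</altitudeMode>
        <coordinates>{coords}</coordinates>
      </LineString>
    </Placemark>
  </Document>
</kml>"""

def fixes_to_kml(fixes):
    # KML expects lon,lat,alt triples separated by whitespace
    coords = " ".join("{},{},{}".format(lon, lat, alt) for lat, lon, alt in fixes)
    return KML_TEMPLATE.format(coords=coords)

sample = [(51.4988, -0.1749, 30.0), (51.5007, -0.1700, 120.0)]  # illustrative fixes
with open("flight_path.kml", "w") as f:
    f.write(fixes_to_kml(sample))
```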
 
The screen snippet 'Capture2' shows the first GPS reading we received when we calibrated our sensors in the Electrical Engineering Department at Imperial College. Test point 2 is the original data, while point 3 is the same point after we deliberately raised its height.
 

Capture 2 - First GPS reading during calibration of sensors

The screen snippet 'Capture' shows the path we linked between two peaks. It demonstrates that we can link any points in a 3D display and present it on a map.

Capture - 3D path between two points in question

Wei has been working hard, with assistance from me and Diyar, on integrating the sensors with the Jessie Lite interface. The light sensor is still playing up, but we have replaced the magnetometer/accelerometer and are now getting much more accurate readings! Wei has also successfully operated and extracted readings from all of the sensors at the same time, with all of them connected to the same breadboard, which is strong progress. The next step is to use the 'Transmit' C code, with the Python library successfully included, to send the sensor readings across to another terminal for upload to Bluemix.
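
As a rough illustration of the polling loop (not our actual driver code: the bus number, addresses and register are hypothetical, and each real sensor needs its own datasheet-driven initialisation and scaling):

```python
# Sketch of polling several I2C sensors in one loop and appending CSV lines
# that the 'Transmit' code could pick up. The addresses and register below are
# hypothetical; real sensors need proper init and unit conversion.
import time
import smbus

bus = smbus.SMBus(1)  # I2C bus 1 on the Raspberry Pi

SENSORS = {
    "light": 0x29,          # hypothetical addresses
    "accelerometer": 0x1D,
}

def read_raw(addr):
    # Simplified single-register read; real sensors need datasheet-driven code.
    return bus.read_byte_data(addr, 0x00)

while True:
    readings = [str(read_raw(addr)) for addr in SENSORS.values()]
    with open("sensor_log.csv", "a") as f:
        f.write(",".join([str(int(time.time()))] + readings) + "\n")
    time.sleep(5)
```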

On an equally positive note, the LoRa board has arrived! The LoRas are mainly being used for two-way (half-duplex) communication between the ground and the balloon: we can send tweets up to the balloon, and the balloon can send telemetry and images down to the ground. Configuration and installation have been successful on both the transmitting and receiving Pis. Now we need to test whether useful data can be sent between the Pis, since LoRa transmissions cannot be decoded by our SDR. This should be done by the end of the week if everything goes smoothly.

I have been working with Diyar on the website, building functionality and design. We decided to merge the 'Sentinel' page with the 'Mission' page to streamline the experience for users. We also added several images of the sensors and GPS dongle with relevant explanations, and Diyar added images to the 'Team' page. We decided on the URL www.projectedge.net, and the website has been officially launched!

On a final note, the Chemical Engineering Department has not gotten back to us yet, and in any case their store doesn't officially stock any type of gas. Therefore the best course of action will be to purchase a large canister of helium from the BOC Group at Imperial for a cost of around £120.

- Peter

 

Usmaan:

I have noted that it is no longer necessary to have a decoding procedure in order to retrieve the transmitted images. This has allowed me to focus on the cognitive and data-processing aspects of the project, since retrieval is no longer a hurdle. I have been considering comparing the received data against a geographical database of information about Earth. On comparison, the probe could tweet quantities relatable to Earth-dwellers about what it is experiencing, for example: "Feels like riding Stealth, Thorpe Park up here!" if the G-forces experienced are within a certain error of that ride. The same principle may be applied to luminosity, pressure and temperature readings, for example: "Feeling the pressure drop of climbing Kilimanjaro x many times!" and so on. The challenge now is to implement Watson's APIs optimally in order to achieve these functionalities in style!
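
A toy sketch of the matching idea is below; the reference values and tolerances are made up for illustration, not researched figures.

```python
# Sketch of the idea: match a sensor reading against a small reference table of
# Earthly experiences and produce a relatable tweet. Values are illustrative.
REFERENCES = [
    # (quantity, reference value, tolerance, tweet template)
    ("g_force", 4.5, 0.5, "Feels like riding Stealth, Thorpe Park up here!"),
    ("pressure_hpa", 540.0, 30.0, "Feeling the pressure drop of climbing Kilimanjaro!"),
]

def relatable_tweet(quantity, value):
    for name, ref, tol, template in REFERENCES:
        if name == quantity and abs(value - ref) <= tol:
            return template
    return None

print(relatable_tweet("g_force", 4.3))  # -> the Stealth tweet
```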

FIRST WEEK PROGRESS UPDATE

Screenshot of program successfully decoding coordinates (in green) received by GPS dongle

Hello all! We are now at the end of the first week of our group project and progress has been strong. We have managed to decode the coordinates received by the GPS dongle and store them in a text file. Diyar and Wei have engaged in some sweat-inducing soldering work and are now configuring the different sensors that we have available (they have done the accelerometer thus far). The website is coming along well, and we plan to keep adding content of many different kinds. Unfortunately the LoRa board is out of stock, and the supplier says that he will endeavour to send it to us as soon as possible; we will look into different suppliers over the weekend.

Pi in the Sky (PITS) is also now up and running, albeit with a very weak signal, which we believe will be solved by the LoRa unit. Max is working on MQTT and exploring how it will interact with Bluemix to convey the information we want. Regarding the decoding of the images, Usmaan has noted that on reception of the information transmitted from the Pi, we need to perform a decoding operation from ASCII text to an image, which we would ideally like to implement in Python. Our main challenge in this operation is parsing large quantities of information to obtain the desired text.
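
As a first stab, the decode step might look like the sketch below, assuming the picture arrives as base64-encoded ASCII chunks on tagged lines; both the "IMG:" marker and the base64 encoding are assumptions about the downlink format rather than confirmed details.

```python
# Sketch of the ASCII-to-image step: collect tagged chunks from the received
# log, concatenate them and decode to binary. Marker, encoding and file names
# are all assumptions about the downlink format.
import base64

def extract_image(log_path, out_path):
    chunks = []
    with open(log_path) as f:
        for line in f:
            if line.startswith("IMG:"):  # hypothetical chunk marker
                chunks.append(line[len("IMG:"):].strip())
    with open(out_path, "wb") as out:
        out.write(base64.b64decode("".join(chunks)))

extract_image("received_telemetry.txt", "snapshot.jpg")  # filenames illustrative
```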

Finally, I have heard back from the BOC Group (the body that supplies Imperial College's gas needs) and they have confirmed that we can in fact purchase a large canister of 200 bar, 99.995% helium and store it safely on the South Kensington campus. However, we will still contact several technicians in the Chemical Engineering Department to see if we can get the product for less than £100, and Diyar has been speaking to another contact at IBM who mentioned that they may also be able to provide us with the much-needed gas.

- Peter

Our Visit to IBM Hursley

This Wednesday our group had the distinct pleasure of visiting IBM Hursley, IBM's research and development lab, which resides in the leafy green pastures of Hursley village, Hampshire. Our day started at approximately 1pm, and after navigating our way through the scenic complex we finally arrived at our destination in G Block: the Galileo Centre.

Green pastures out in front of the IBM Hursley complex

After checking in at reception, we met John McNamara, with whom we had briefly conversed a few days earlier on Google Hangouts. He showed us around the Galileo Centre and introduced us to the very wise and friendly, quintessential computer scientist Steve Upton.

Steve gave us a brief but insightful introduction to the high-level software architecture known as microservices. This architecture relies on breaking a larger project up into smaller, self-contained apps. It allows the programmer to fold the inevitability of human error into their design and to make adjustments to individual apps that have failed. The example Steve gave was Netflix: if its self-contained login page fails, that wouldn't directly impact someone watching Breaking Bad. The faulty login page could then be fixed and put back into the live system without the other 'boxes' needing to be updated as well. This type of architecture also benefits teamwork, as it allows different members to work on different aspects of the project independently yet cohesively as a unit. Steve mentioned that Netflix has a specific piece of software called the "Chaos Monkey" that deliberately breaks different 'boxes' in order to test the ability of the entire system to handle error.

This software design could be very useful in our project: if one aspect of our probe fails (e.g. the Twitter feed), we would like to adapt as nimbly as possible so as to keep the remaining systems functioning. The intermediary between these different microservices that Steve recommended was MQTT messaging (implemented through Bluemix), which uses the efficient publish/subscribe data-management pattern (although I won't get into too much technical detail in this blog post).

After the microservices explanation, Steve brought us up to the IoT room and gave us a live demonstration of the Bluemix software using the Node-RED platform, showing us the ease and versatility with which we can implement the microservices architecture in our own project. Around the IoT room there were also temperature-sensitive lights and a rotating floor, which should be able to 'sense' our probe's temperature and spatial conditions respectively when it is in flight.

Finally, we ventured past the company cricket field and over to the clubhouse, where we had a relaxing pint of Guinness with John. We discussed more of the project logistics, including the acquisition of helium and the transportation of the probe to the launch site in Cambridge. We left Hursley at approximately 4pm and set off into the green meadows to unbox our kit and start building the "Space Sentinel" probe.

- Peter

Introduction to our Blog

Hello everyone! From now until our much-anticipated launch, we will be documenting the journey of our project. Firstly, we have provisionally planned our launch for 10th June; that is, before adjusting for weather conditions, availability at Churchill Cambridge, etc. (they have just recently given us the go-ahead!).

Our group had its first meeting, in which we discussed the fundamental layout of the project. We split into smaller teams to tackle the very first stages of research efficiently: Arshan and I (Peter) focused on the Raspberry Pi, PITS and LoRa modules; Diyar, Wei and Yihan focused on the sensors and website design; and Max, Shawn and Usmaan focused on the software ecosystems provided by IBM.

We are eagerly awaiting our visit to IBM Hursley next Wednesday, when we can ask as many questions as possible about the probe design, learn more about IBM and their vision going forward, and receive our kit, which will allow us to delve much deeper into the technical side of the project.

- Peter