Hello All! We are now officially in June and, with less than two weeks to our provisional launch date, we are beginning to finalise our design and construction.

I spoke with the Department Safety Advisor, Andrew Paice, and he informed me that transporting the helium ourselves would be too difficult from a safety standpoint, at least within our short time frame. With that in mind, I have confirmed with my contact at the BOC Group that we can purchase high-quality helium and have it transported to and from our launch site by BOC, thus taking the weight off our shoulders. However, the valves of the available regulators are far too small for our balloon diameter (7.8 cm, accurate to ±0.4 cm), so we will need to use a hose to channel the gas. I have also emailed our contact at Churchill Cambridge about what materials they will provide (if any) for our launch (e.g. gloves, nylon string, duct tape) and am waiting to hear back.

I have been working with Diyar and Shawn on creating live graphs to display sensor data on our website using JavaScript's D3.js library, although Diyar and I have put that work on hold for the time being in favour of other, more essential features that need our attention. However, Shawn has continued working on animations that convey our sensor readings in a way that is engaging for the user. He has been working with the animation program Blender, which uses tools such as keyframes and F-curves to create animations, deformations and so forth. Our hope is that these graphics can be used to, for instance, display a man shivering, perhaps with snow falling in the background, whenever the temperature surrounding our probe drops below a certain threshold. We believe this would be a great and unique way to represent our sensor data in a more engaging manner.
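To give a feel for what keyframes and F-curves actually do: an animated property is defined by a handful of (frame, value) keyframes, and the F-curve fills in every frame in between. Blender's real evaluation uses Bezier handles and various easing modes; the sketch below is a simplified linear version of the same idea, with made-up example values.

```python
def evaluate_fcurve(keyframes, frame):
    """Linearly interpolate a property value at `frame` from a sorted
    list of (frame, value) keyframe pairs."""
    if frame <= keyframes[0][0]:
        return keyframes[0][1]   # before the first keyframe: hold first value
    if frame >= keyframes[-1][0]:
        return keyframes[-1][1]  # after the last keyframe: hold last value
    for (f0, v0), (f1, v1) in zip(keyframes, keyframes[1:]):
        if f0 <= frame <= f1:
            t = (frame - f0) / (f1 - f0)  # fraction of the way between keys
            return v0 + t * (v1 - v0)

# e.g. a balloon's altitude keyframed at frames 1, 50 and 100:
keys = [(1, 0.0), (50, 10.0), (100, 30.0)]
print(evaluate_fcurve(keys, 75))  # halfway between frames 50 and 100 -> 20.0
```

In Blender itself you would set these keyframes on an object property and let its F-curve editor handle the interpolation; the arithmetic above is just the linear special case.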

Static Blender model of a helium-filled balloon on the ground

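The threshold idea described above can be sketched in a few lines of Python. The clip names and temperature cut-offs here are hypothetical, purely for illustration; the real mapping would be driven by whatever animations Shawn renders.

```python
FREEZING_C = 0.0  # hypothetical cut-off for the "shivering" animation

def pick_animation(temperature_c):
    """Map the probe's latest temperature reading to an animation clip name.
    All clip names are placeholders for illustration."""
    if temperature_c < FREEZING_C:
        return "man_shivering_snow"   # cold: shivering figure, snow falling
    elif temperature_c > 30.0:
        return "man_fanning_sun"      # hot: hypothetical warm-weather clip
    return "man_neutral"              # otherwise: neutral idle animation

print(pick_animation(-42.5))  # -> man_shivering_snow
```

The website would then simply swap in the returned clip whenever a new reading arrives, keeping the sensor logic separate from the rendered animations.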
In other news, Wei and Max have uploaded the first two of many tutorials to the website! Wei's tutorial is an in-depth walkthrough of the tracker, covering the inventory of key components and add-ons (LoRa, sensors etc.) as well as their installation on the Jessie Lite and Raspbian Lite operating systems. Wei will upload another tutorial very soon about the OLED setup, once he has it working reliably. Max's extensive tutorial covers the troublesome topic of decoding the radio transmissions: it goes into depth on how to use the FUNcube Dongle Pro+ together with the SDR# and dl-fldigi programs to receive, display and decode the radio signals from the PITS board. Max will be posting more tutorials in the very near future, the first covering the uplink/downlink. Yihan has also sent me his GPS data mapping tutorial, which will go up very shortly!

Regarding the OLED setup, Yihan found a library for the display on the Adafruit website. Wei has been working with this library, but because it was designed for an older model of the display, and because the available documentation is rather poor, there have been numerous issues with getting the text properly aligned on the screen. However, it is now very close to being sorted. He has also been writing custom Python files to interface with the screen, as well as modifying the PITS software for the camera: the idea is that the screen should change the tweet it displays after a picture has been taken. We are confident this can be done by the middle of next week at the latest.
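The alignment problem comes down to pixels: the display is addressed in pixel coordinates, so to centre a line of text you need its rendered width. Below is a minimal sketch assuming a 128-pixel-wide SSD1306-style OLED and a fixed 6-pixel character cell (the classic 5x7 font plus 1 px spacing that many small OLED libraries default to); those dimensions are assumptions for illustration, not necessarily our exact hardware or font.

```python
DISPLAY_WIDTH = 128   # assumed SSD1306-style OLED width, in pixels
CHAR_WIDTH = 6        # assumed fixed-width font: 5 px glyph + 1 px spacing

def centred_x(text):
    """Return the x pixel offset that horizontally centres `text`,
    clamped to 0 if the text is wider than the display."""
    text_width = len(text) * CHAR_WIDTH
    return max((DISPLAY_WIDTH - text_width) // 2, 0)

print(centred_x("Hello HAB!"))  # 10 chars * 6 px = 60 px wide -> x = 34
```

With a proportional font the same approach applies, except the library's own text-measuring call would replace the fixed-width arithmetic.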

- Peter