Determination of atmospheric ozone concentration through absorption of UV and visible light in an integrating sphere

Research Group: Cameron Brewer, Jason Schilling, and Noah Thompson

Launch: Whitworth Spring 2019

''The goal of this project was to determine the concentration of ozone in the atmosphere as a function of height. Ozone is a molecule formed by radiation from the Sun in the upper atmosphere, and it stops harmful radiation from the Sun before it reaches the surface. Our project used an integrating sphere containing LEDs of different wavelengths and two light sensors to measure how much light reached the sensors from those LEDs. We used the AS7265x light sensor and the FDS010 photodiode with a 255nm, 405nm, 600nm, and white LED. As the altitude of the pod increased, more ozone would enter the integrating sphere and absorb some of the light from the LEDs. Then, based on how much the readings from the light sensors decreased, we could determine how the concentration of ozone in the atmosphere changed with altitude. However, there were some flaws in our design, such as exposed wire ends that could have touched and poor organization of the wires inside our pod. The biggest issue was that the AS7265x light sensor became disconnected during the final assembly of the pod, which stopped the code from running, so we did not get any data during the flight.''

Background


Ozone is a gas that is very important to humans and all life on Earth. If it were concentrated in the air around us, it would be toxic to breathe, but if it weren't in Earth's atmosphere at all, life on Earth would not be possible because of the radiation from the Sun. Ozone is a molecule made up of three oxygen atoms, and it is mostly found high in Earth's atmosphere. The most important property of ozone is that it absorbs UV-C light. Humans generally use sunscreen to protect against UV-A and UV-B light, but sunscreen is not strong enough to protect against UV-C. UV-C light has a shorter wavelength than UV-A and UV-B, and it is more harmful to humans than the other types of ultraviolet light. Luckily for us, there is enough ozone in the atmosphere that UV-C light does not make it to Earth's surface.

When ultraviolet light strikes ordinary oxygen molecules (each made of two oxygen atoms), it can split them into individual oxygen atoms, which then bond with other oxygen molecules to form ozone. As for the color of the sky: when sunlight passes through the atmosphere, longer-wavelength light passes through largely undisturbed, while shorter-wavelength light (violet and blue) is scattered much more strongly by the molecules in the air. We see the sky as mostly blue because our eyes detect blue more readily than violet.

The goal of this project is to see how much ozone there is in our atmosphere. More specifically, we will be measuring the concentration of ozone in the stratosphere as a function of height.

Theory


We can measure the concentration of ozone by using the fact that it strongly absorbs certain wavelengths of light. We will do this using an integrating sphere. We constructed a sphere with a hole for light to enter and light sensors in other parts of the sphere. The light enters the sphere, then bounces around off its reflective interior before being seen by one of the light sensors. The diameter of the sphere is only 4 or 5 inches, which by itself would not be enough distance to measure how much light is being absorbed. After bouncing around in the sphere, however, the light will have effectively covered much more distance before reaching a sensor. This allows us to measure how much of the light is absorbed inside the sphere, which corresponds to the concentration of ozone in the sphere.
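The relationship between absorption and concentration is the Beer-Lambert law, I = I0 * exp(-sigma * N * L), where sigma is the absorption cross-section, N the number density of ozone, and L the effective path length inside the sphere. A minimal sketch of inverting this for N; the cross-section and path-length values in the usage example below are order-of-magnitude assumptions for illustration, not measured values for this sphere:

```cpp
#include <cmath>

// Beer-Lambert law: I = I0 * exp(-sigma * N * L).
// Solving for the absorber's number density N, given a measured
// intensity ratio I/I0, an absorption cross-section sigma (cm^2),
// and an effective path length L (cm):
double ozone_number_density(double I, double I0, double sigma_cm2,
                            double path_cm) {
    return -std::log(I / I0) / (sigma_cm2 * path_cm);
}
```

For example, if the 255nm reading dropped to 75% of its sea-level value, then with an assumed cross-section on the order of 1e-17 cm^2 and an assumed 100 cm effective path from multiple reflections, N = -ln(0.75)/(1e-17 × 100) ≈ 2.9e14 molecules/cm^3.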

We made this sphere by taking two cake pans shaped like hemispheres and painting the insides with white spray paint (so they will be reflective). There is a hole on the top of the sphere with a PVC pipe end cap holding LEDs of different wavelengths, so we can choose what wavelength enters the sphere. We have a 600nm LED, a 405nm LED, a 255nm LED (UV-C light), and a white LED. The 600nm LED corresponds to the peak absorption wavelength of ozone in the visible spectrum, while the 255nm LED corresponds to ozone's overall peak absorption wavelength (meaning ozone absorbs 255nm light most strongly). Last year's group had a problem with nitrogen dioxide, which is also present in the atmosphere: its absorption overlapped with that of ozone, which kept them from getting reliable data. The 405nm wavelength corresponds to the peak absorption of nitrogen dioxide, so we include a 405nm LED to measure how much nitrogen dioxide is present. The white LED is there to measure how much "noise" is in our data. Gases like helium, hydrogen, and ordinary oxygen could affect our data (because these gases absorb other wavelengths of light), so the white LED, which emits a mixture of wavelengths, lets us see when the absorption at other wavelengths starts to drop (indicating that the concentrations of those other gases are decreasing). We expect to find that as altitude increases, the concentration of other gases decreases while the concentration of ozone and nitrogen dioxide increases.

We have two light sensors inside the sphere. One is able to measure specific wavelengths of light, showing us how much of each wavelength is measured in 20nm wavelength bands. However, this light sensor cannot measure below 400nm wavelengths. We have a separate light sensor that measures 200nm to 1100nm wavelengths, which will be used to measure the 255nm LED. This sensor does not tell us specifically what wavelengths it is measuring, which is why we have the other sensor for the 405nm, 600nm, and white LEDs.

Pod Organization
The integrating sphere is in the middle of the pod with fans on two adjacent sides, 90 degrees apart (one pulling air in, one blowing air out), to keep the ozone concentration inside the sphere the same as the concentration outside it. We are reusing a pod from last year, as it was in good condition and easily adaptable to our experiment design. The Whitworth launch computer is positioned in the lid of the pod, with space cut out for it in the lid.

Integrating Sphere


The integrating sphere was made by putting together two Fat Daddio’s cake pans, which are hemispheres. They were spray painted with flat white spray paint to make the inside reflective. Five holes are drilled into the sphere: two for the fans, one for the LEDs, one for the FDS010 sensor, and one for the optical fibers, which carry light to the AS7265x sensor, the sensor that measures every LED other than the 255nm LED. The LED hole is directly opposite the hole for the FDS010 sensor, so that for the UV light it is only the ozone inside the sphere, not the paint, that absorbs the light. The hole for the optical fibers is roughly 45 degrees from the LED hole, so that light reaching them has bounced and made multiple passes through the sphere, maximizing absorption by ozone before measurement. The LEDs and light sensors were deliberately positioned so that the 255nm light travels straight across the sphere to its sensor, while the other wavelengths bounce around inside the sphere before reaching theirs. This is because ozone absorbs very strongly at 255nm, so the 255nm light does not need to travel through much ozone for there to be a noticeable decrease in the amount of light detected. The other wavelengths, however, must bounce around inside the sphere so that they effectively cover more distance, increasing the amount of light absorbed by ozone; this is the only way we can detect a noticeable decrease at these wavelengths as the ozone concentration increases. The fans sit in 3D-printed mounts from last year's group, which are epoxied to the outside of the sphere. They are 90 degrees apart so that they can stick through the walls of the pod to pull in and expel external air.

Fan & LED Mountings


The fans were set in 3D-printed mounts reused from last year’s project. The fan mounts were printed in ABS plastic to minimize ozone absorption. The LEDs are arranged in a ring inside a PVC end cap attached to the outside of the sphere; the light bounces around inside the end cap before entering the sphere through a small hole, so it is diffuse by the time it enters. The UV LED is positioned at the end of the cap, directly above the hole, to minimize the amount of UV light absorbed by the sphere itself. The UV light will be absorbed by ozone much more strongly than the other wavelengths, so it does not need to reflect through the sphere at all. It will pass straight across the diameter of the sphere to the sensor, which provides enough absorption to detect the changing ozone concentration. If it were to reflect through the sphere, we might not detect any of it, because all of it could be absorbed.

AS7265x Optical Fibers
The AS7265x sensor has optical fibers carrying light from inside the integrating sphere to the three individual sensors on the AS7265x board, one fiber per sensor. We created a mount for these fibers out of Delrin plastic with the laser cutter. It is shaped like a cylinder with a smaller cylinder on top (like a water bottle with its cap). The small cylinder fits into the side of the integrating sphere and has three holes through it, just big enough for the optical fibers, which were fixed in place with epoxy. We also used Delrin to make small mounts over each individual sensor on the AS7265x board, shaped like hollow cubes with one side missing. Each has a small hole in the top for an optical fiber, positioned directly over the opening of the sensor. This carries light from inside the integrating sphere to the AS7265x sensor.

Whitworth Launch Computer
The circuit was set up on the Whitworth launch computer with a battery mounted to the back of it.

LEDs
All LEDs were connected through a current-limiting resistor to keep them from burning out, and driven through a transistor so they could draw more current and shine brighter. We have four types of LEDs: 255nm (the peak absorption of ozone), 600nm (the peak absorption of ozone in the visible spectrum), 405nm (the peak absorption of nitrogen dioxide), and white (which covers the majority of the visible spectrum). Last year's ozone group could not distinguish between absorption from nitrogen dioxide and absorption from ozone, since the two somewhat overlap, so we chose specific LED wavelengths, and light sensors that detect specific wavelengths, to distinguish the peak absorption of nitrogen dioxide from that of ozone. Only one LED is on at a time; the code toggles between them and captures data with the sensors each time a new LED turns on.
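The toggling described above is a simple round-robin cycle. A sketch of just the sequencing logic; on the mbed, the on/off switching would drive DigitalOut pins, which this sketch leaves out:

```cpp
// One LED on at a time: advance() switches off the current LED and
// switches on the next, wrapping around after the last. The index
// order (e.g. 0=white, 1=405nm, 2=600nm, 3=255nm) is arbitrary here.
class LedCycler {
public:
    explicit LedCycler(int count) : count_(count) {}
    int current() const { return index_; }   // index of the LED that is on
    int advance() {                          // move to the next LED
        index_ = (index_ + 1) % count_;
        return index_;
    }
private:
    int count_;
    int index_ = 0;
};
```

In the flight code, each call to advance() would be followed by a short settling delay and then a read of both sensors, so every sample is tagged with which LED was lit.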

All of the LEDs share a common ground connection. There are two of each color of LED (except the UV LED), and each color has its own power connection (i.e. the two orange LEDs share a power connection, the two purple LEDs share a power connection, etc.). However, each LED has its own resistor in its own branch, rather than several LEDs sharing one resistor, to maximize the current going through each one.

AS7265x Sensor


The AS7265x sensor is a light sensor with three smaller sensors on it. Each of the three has 6 wavelength channels, so the whole device has a total of 18 channels. Each channel detects a roughly 20nm band of wavelengths, so the device covers nearly the entire visible spectrum and reaches into the infrared. The device records a value from each of its 18 channels separately, returning a number between 0 and about 65,000 corresponding to how much light was detected in that channel. Thus, we can see how much of each specific wavelength is being detected. This makes it easy to distinguish between the purple and orange LEDs, and to see how much of each wavelength emitted by the white LED is present. This sensor communicates with the mbed over I2C, and was programmed using a library of functions we created specifically for this sensor.
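The 18 channels can be matched to the LEDs in software by comparing each LED's wavelength against the channel centers. A sketch, using the nominal channel center wavelengths from the AS7265x datasheet (treat the exact centers as approximate):

```cpp
#include <array>
#include <cstdlib>

// Nominal center wavelengths (nm) of the 18 AS7265x channels,
// per the sensor's datasheet; each channel's band is about 20nm wide.
constexpr std::array<int, 18> kChannelCentersNm = {
    410, 435, 460, 485, 510, 535, 560, 585, 610,
    645, 680, 705, 730, 760, 810, 860, 900, 940};

// Index of the channel whose center lies closest to a target
// wavelength, e.g. to pick which channel should respond to each LED.
int nearest_channel(int target_nm) {
    int best = 0;
    for (int i = 1; i < 18; ++i)
        if (std::abs(kChannelCentersNm[i] - target_nm) <
            std::abs(kChannelCentersNm[best] - target_nm))
            best = i;
    return best;
}
```

This also makes the sensor's blind spots explicit: the 405nm LED falls just below the lowest channel (410nm), which is why anything shorter, like the 255nm LED, has to be handled by the FDS010 photodiode instead.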

FDS010 Photodiode
The AS7265x sensor cannot detect wavelengths in the ultraviolet range, so to detect the 255nm LED we use the FDS010 photodiode, which can detect wavelengths from the ultraviolet through the infrared. Unlike the AS7265x sensor, however, it does not report how much of each individual wavelength it detects. We planned to use it mainly to measure how much light was present when the UV LED was on. The photodiode produces only a small current, too small for the mbed to read directly, so an op amp was used in the circuit to convert that current into a voltage; the mbed's analog input then returns a number between 0 and 1 representing that voltage as a fraction of the mbed's reference voltage.

Fans
The circuits for the fans were reused from last year's project, so the circuits are already soldered and set up for us. The fans do not need to be controlled in the code because they will just be running the entire time, so they are connected to an external 9V battery that will keep them running.

Code


Here is a link to our code, which was written in C++ in the mbed online compiler. Part of the code came from an AS7265x library created by Noah Thompson and Jason Schilling, which can be found at the same link. All times shown in the diagram are in seconds.

LEDs
We used a simple program that toggled between our four types of LEDs. This was done before the LEDs were in the integrating sphere, just to ensure that they all worked as expected. We wore goggles and gloves to protect our eyes and skin from the 255nm LED. Since we cannot see that wavelength, we held a business card up to the UV LED; the paper that business cards are made of glows a blue/purple color under UV light, so we could confirm that the UV LED was working as expected.

We then put both light sensors and the LEDs together inside a cardboard box (not yet in the integrating sphere). Running the same toggling code, we verified that the AS7265x sensor could tell which LED was on at any time, since it measures specific wavelengths, and that the values from the FDS010 sensor increased when the LEDs were on, as expected.

Next, we worked on making the LEDs as bright as possible, since they would need to be fairly bright for our design to work once they were in the integrating sphere. We looked at the datasheet for each type of LED and used Ohm's law to calculate the resistance needed for each one based on its maximum current. Based on this information, we also determined that a separate 9 volt battery would work well to power the LEDs. The white LEDs got 330 ohm resistors, the purple LEDs 270 ohm, the orange LEDs 100 ohm, and the UV LED 25 ohm (made by putting two 50 ohm resistors in parallel). We could also make the LEDs brighter by giving each one its own circuit with its own resistor, instead of having several LEDs share one, so that each LED carries more current. So, we ended up with a separate circuit for each individual LED, each with its own resistor.
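The resistor calculation above is Ohm's law applied to the supply voltage left over after the LED's forward drop: R = (V_supply − V_forward) / I_max. A sketch (the forward voltage and current limit in the usage example are hypothetical; the real values came from each LED's datasheet):

```cpp
// Current-limiting resistor for an LED: the resistor must drop the
// supply voltage remaining after the LED's forward voltage, at the
// LED's maximum rated current.
double led_resistor_ohm(double v_supply, double v_forward, double i_max_A) {
    return (v_supply - v_forward) / i_max_A;
}
```

For example, a 9 V supply driving a hypothetical LED with a 2.0 V forward drop and a 20 mA limit needs (9 − 2)/0.02 = 350 ohms, the same order as the 330 ohm resistors chosen for the white LEDs here.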

AS7265x Sensor
The AS7265x sensor was giving higher values in the correct channels when inside the cardboard box with the LEDs, so we knew that it was working as intended. However, when everything was transferred to the integrating sphere, the readings from the AS7265x sensor were too low to get any useful data. The AS7265x sensor returns an integer between 0 and around 65,000 for each wavelength based on how much light there is of that wavelength. When everything was inside the integrating sphere, none of the wavelengths on this sensor were reading above 100. After seeing this, we decided not to use the AS7265x sensor in our final experiment. We would still have it gathering data, but we did not plan to use the data that was gathered because we assumed all of the readings would be very low and not useful to us. Therefore, all of the data we planned to get was from the FDS010 sensor.

FDS010 Photodiode
The FDS010 photodiode was put in the circuit using an op amp, which meant we needed to find the right amount of feedback resistance to get the op amp's output where we wanted it. Increasing the resistance of the resistor running from pin 1 to pin 2 of the op amp increases the values read from the photodiode. The UV LED would give us the most reliable data because its wavelength matches the peak absorption of ozone, so we tried to choose a resistance that made the op amp return a value of about 0.75 with no ozone in the sphere. Then, as the concentration of ozone in the sphere increased, the reading would ideally decrease toward about 0.25, roughly the minimum reading in our case. However, we could not achieve this with any resistor combination: getting the op amp to return 0.75 with the UV LED on would have required a very large resistance. We ended up using a 10 megaohm resistor, which gave a reading of about 0.35 with the UV LED on, and a minimum reading of about 0.25. This would hopefully still give us some data, but it would not be as sensitive or accurate as we would have liked.
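These numbers imply a very small photocurrent. Treating the op amp as an ideal transimpedance amplifier, and assuming a 3.3 V analog reference on the mbed (an assumption about the board, not stated above), a 0.35 reading through the 10 megaohm feedback resistor corresponds to roughly 0.35 × 3.3 V / 10 MΩ ≈ 0.12 microamps. A sketch of the relationship:

```cpp
// Transimpedance stage: the op amp converts the photodiode current
// into a voltage, V_out = I_photo * R_feedback, and the mbed's
// analog input reports V_out as a fraction of its reference voltage
// (assumed 3.3 V here).
double adc_fraction(double photocurrent_A, double r_feedback_ohm,
                    double v_ref = 3.3) {
    double v_out = photocurrent_A * r_feedback_ohm;
    if (v_out > v_ref) v_out = v_ref;  // the reading saturates at 1.0
    return v_out / v_ref;
}
```

This is why the target reading of 0.75 was so hard to reach: with a photocurrent this small, only an enormous feedback resistance brings the output voltage up to a usable fraction of the reference.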

Calibration


For calibration, the entire mechanism was placed in an ozone chamber, which slowly generates ozone and provides a controlled environment in which it can be measured. The chamber contains an ozone sensor that accurately measures ozone levels, and we wanted to compare its readings with those from our light sensors to determine which changes in our light readings corresponded to which concentrations of ozone. However, that ozone sensor needs to run for a full 24 hours before it can be used, and it had not been turned on the day before, so it was not available when we put our experiment in the chamber. We therefore could not calibrate our experiment, but we could still use the test as validation that the light sensor readings went down while inside the ozone chamber.

The readings did show a slight decrease over time, but it was hard to tell how well the experiment was working: the chamber does not generate much ozone, and the materials used to build the sphere and circuit absorb ozone themselves, so the change in ozone concentration during the test was likely small. Our experiment also did not gather data for as long as we wanted because the battery died during the test. This was an important finding, though, because it meant the experiment was drawing too much power to run for the duration of the flight. We disabled some of the indicator LEDs on the mbed that were used for testing purposes, which were likely consuming much of the battery; after that, the experiment ran for over four hours.

After the launch, we put the sphere back inside the ozone chamber. The code had been changed slightly since the last test, and an error in the library used for extended timers caused data to only be collected for the first ten minutes of the testing. The code was fixed, and we put it in the ozone chamber for the third time. This time, it collected data for over four and a half hours. However, the data showed no decrease in the values from the light sensors. It is possible that the fan blowing air into the sphere was pushed against the wall of the ozone chamber during the test, so no ozone from the chamber was being taken into the sphere. It is still unclear whether the design of our experiment would be able to adequately measure a change in the concentration of ozone.

Data and Analysis
When we got our pod back from launch, there was no data on any of the files on the SD card. At first, we thought this may have been due to a problem with the battery. During testing, our battery died after slightly more than one hour of operation in the ozone chamber. We turned off some of the LEDs used for testing in our code and hoped that this would let it last for multiple hours during the flight. When we got the pod back after the flight, the battery was dead. Our code was set to load data onto the SD card every 10 minutes (by mounting and unmounting from the SD card), so we thought that maybe the battery failed within the first 10 minutes of the flight. Another possibility was that the wrong code was loaded on the mbed, and possibly the code loaded on it was erasing the data on the SD card.

We checked the file on the mbed and confirmed that it had the same number of bytes as the file we intended to load, so this was not the issue. Inspecting the battery after the flight, we found that the wire used to charge it was broken. When we tried to charge the battery to see if the code would run, the charger reported that the battery was already charged. We soldered the charging wire back onto the battery, and it then charged normally. However, the code still would not run after charging the battery. The indicator LEDs in our code were turning on, showing that the SD card was being mounted, but no data was being recorded to it. We put print statements at various places in the code to see where it was stopping, and determined that it stopped at the first command to the AS7265x sensor, which sets the Bank mode to configure how it collects data. Because the code had worked before, the problem was likely hardware rather than software, so we took the sensor out of the pod to inspect it. After taking it out and putting it back in, seemingly without changing anything, the code worked normally. From this, we concluded that the wires connected to the AS7265x sensor must have come loose at some point during the flight, or the morning before launch, when everything was packed into the pod with more force than was ideal.

We should have worked more on wire and pod organization to prevent these issues when putting the pod together, and we should have used more heat shrink closer to the end of the wire when soldering. Some of the ends of the wires we soldered were still exposed, so they could have potentially crossed and had metal touching metal, which could easily cause issues in the circuit.

Conclusion
We ended up not getting data from our experiment due to an issue that likely could have been prevented with better time management. There were multiple aspects to our project that could have potentially caused issues, but we ultimately did not get data because of a connection issue with the AS7265x sensor. If we had done more work earlier in the semester, we would have gotten the validation step done earlier and realized that the AS7265x sensor would not get good data for us during the flight. Then, we could have removed the sensor from our design and not had it in the pod during launch day. This would have saved room in the pod and eliminated the possibility of the connections to the sensor coming loose and causing our code to stop running.

We should have included an indicator LED inside of the while loop in our code when the sensors started collecting data. We had one LED to indicate that the battery was turned on, and one LED to indicate that the SD card was mounted correctly and ready to record data. Unfortunately, our code was stopped by the poor connection to the AS7265x sensor in the few lines of code immediately after the SD card was mounted, so our indicator LEDs turned on and indicated that there were no issues. If we had a separate LED indicating when data collection started, we would have seen that it was not collecting data, and we possibly could have found the issue with the AS7265x sensor's connection before we launched.

We also could have had a problem with wires crossing. With more time when assembling the final pod, we could have spent more effort on wire management, making sure the wires had enough heat shrink to keep them from crossing, and organizing them so they would be less likely to come loose or touch each other. The issue with the battery's charging wire seems to have occurred during the flight. Luckily, that wire is only needed to charge the battery, not to run it, so this would not have affected our chances of getting results. However, better wire management and more space in the pod would likely have kept the wire from breaking, so removing the AS7265x sensor and organizing the wires could have prevented this too. And if the charging wire could break during the flight, it is also possible that more important wires on the battery could have broken.

In conclusion, we still are not completely certain whether our design would have collected useful data had it worked as intended. There were certainly some improvements we could have made to the project that would have made it work better and more reliably.