News & Events



MS Defense - Andrew Rickert

Andy successfully defended his master's thesis on June 11, 2009.

Neural Networks and Atmospheric Scattering Calculations

For most particles in the atmosphere, calculations of scattering energy loss grow more accurate in proportion to the computing time afforded them. Accurate radiative transfer calculations, even with the most efficient numerical methods, are computationally expensive. This becomes a serious problem for multi-decadal climate simulations, for which an accurate representation of the radiative impact of atmospheric constituents is crucial. This thesis presents a method for reducing the computational expense of radiative transfer calculations of aerosol scattering properties, which are used in chemical models. The goal of this research is to develop a fast scattering code using a neural network trained on input and output data derived from an accurate T-Matrix scattering algorithm. The input space of the neural net consists of parameters that describe the atmospheric scattering conditions, such as the wavelength of incoming light, the effective particle radius, and the index of refraction. The output space consists of the coefficients of a Legendre polynomial expansion of the phase function. The neural net finds the nonlinear mapping between the input and output spaces for a training set and can subsequently be used to generate the phase function for arbitrary wavelength, particle radius, and index of refraction. In this research, a neural network applicable to both Lorenz and Mie scattering is developed and tested for accuracy and speed. The accuracy of the neural net is found to be excellent, with errors well below 10%, and runtime testing shows that the neural net is approximately five times faster than a lookup table.
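The data flow the abstract describes can be sketched as follows. This is a hypothetical illustration, not the thesis code: the layer sizes, weights, and sample input values are stand-ins (the real network would be trained on T-Matrix output), but it shows the mapping from the three scattering parameters to Legendre coefficients, and how a phase function is then reconstructed from those coefficients.

```python
import numpy as np
from numpy.polynomial.legendre import legval

# Stand-in weights for a tiny feed-forward net; the actual network in
# the thesis is trained on T-Matrix scattering data.
rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 3, 16, 8   # 3 inputs -> 8 Legendre coefficients (sizes assumed)
W1 = rng.normal(scale=0.5, size=(n_in, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.5, size=(n_hidden, n_out))
b2 = np.zeros(n_out)

def predict_legendre_coeffs(x):
    """One forward pass: tanh hidden layer, linear output layer."""
    h = np.tanh(x @ W1 + b1)
    return h @ W2 + b2

# Example input: wavelength, effective particle radius, index of refraction
# (illustrative values only).
x = np.array([0.55, 0.2, 1.45])
coeffs = predict_legendre_coeffs(x)

# Reconstruct the phase function P(theta) = sum_l c_l * P_l(cos theta)
theta = np.linspace(0.0, np.pi, 181)
phase = legval(np.cos(theta), coeffs)
print(coeffs.shape, phase.shape)   # (8,) (181,)
```

Once trained, a forward pass like this is just two small matrix products, which is the source of the speedup over interpolating in a large precomputed lookup table.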