Sunday, October 1, 2017

Trajectory Design for 161C

Earth-Mars trajectory


Capture at Mars to Deimos's orbit


Deimos-to-Phobos transfer
These trajectories to and around Mars were designed for UCLA's Spring 2017 Spacecraft Design course (161C). The class was taught by Dr. Dan Goebel and was a great experience. My group's mission consists of taking scientific measurements around Deimos (the outer moon), dropping a lander there, and then proceeding to Phobos (the inner moon). I designed the trajectories to make this happen by taking orbit data from the SPICE toolkit and propagating with MATLAB's built-in ODE solver ode23. Electric propulsion was chosen over chemical propulsion after a trade study, and the 25 cm XIPS thruster was selected (though the NSTAR would have been slightly better at minimizing launch mass).
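As a rough illustration of the propagation step (not the actual 161C script, which used SPICE ephemerides and ode23 in MATLAB), here is a minimal Python/SciPy sketch of a low-thrust spiral around Mars. The thrust level, spacecraft mass, and starting orbit are placeholder values.

```python
import numpy as np
from scipy.integrate import solve_ivp  # rough Python analog of MATLAB's ode23

# Illustrative numbers only -- not the actual 161C vehicle or thruster values.
MU_MARS = 4.2828e4        # km^3/s^2, Mars gravitational parameter
THRUST = 80e-6            # kN (80 mN), order of magnitude for a 25 cm XIPS
MASS = 1500.0             # kg, assumed wet mass (treated as constant here)

def low_thrust_dynamics(t, y):
    """Two-body gravity about Mars plus a constant retrograde low thrust."""
    r, v = y[:3], y[3:]
    a_grav = -MU_MARS * r / np.linalg.norm(r) ** 3
    a_thrust = -(THRUST / MASS) * v / np.linalg.norm(v)  # retrograde: spiral down
    return np.concatenate([v, a_grav + a_thrust])

# Start in a circular orbit near Deimos's radius (~23,463 km) and spiral
# inward toward Phobos (~9,376 km).
r0 = np.array([23463.0, 0.0, 0.0])                     # km
v0 = np.array([0.0, np.sqrt(MU_MARS / 23463.0), 0.0])  # km/s, circular speed
sol = solve_ivp(low_thrust_dynamics, (0.0, 30 * 86400.0),
                np.concatenate([r0, v0]), rtol=1e-9, atol=1e-9)
print("radius after 30 days of thrusting:",
      np.linalg.norm(sol.y[:3, -1]), "km")
```

The actual design also covers the heliocentric Earth-Mars leg and the capture spiral shown above; only the integration pattern carries over.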

This is the mission timeline (1.85 years total).


The launch vehicle is a Falcon 9 Full Thrust and this is the mass breakdown.


Thanks to Nick Brown, Brandon Busbee, Justin Fedasko, Vibhav Gaur, Sunderlin Jackon, and Alvaro Pons Pelufo for making this such a fun group project.


Bonus gif









Tuesday, December 20, 2016

UCLA FSAE Chassis/Suspension 2016-2017

As Chassis/Suspension Lead for UCLA's 2016 Formula SAE vehicle, Mk. III, I determined the target suspension parameters (weight distribution, spring rate distribution, damping rates) and the suspension geometry (wishbone arrangements, motion ratios, shock placement), and I designed the bulk of the chassis. In the previous two years, I focused more on the hardware aspects of the suspension and chassis.

The determination of vehicle parameters was split into three parts:
  1. Finding acceptable wishbone geometry
  2. Using a pseudo steady-state MATLAB model to find weight and wheel rate distributions
  3. Using a five-mass, five-degree-of-freedom MATLAB model to determine damping rates
Part 1.
I used vsusp.com to find suspension geometries that minimize lateral roll center movement. Camber gain was only lightly considered since the Hoosier LC0 tires we planned to run are fairly insensitive to any gain. Here is a screenshot of the front suspension geometry used.


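As a rough illustration of the construction that tool automates, here is a minimal 2D Python sketch of the front-view roll-center calculation, with placeholder pivot coordinates rather than our actual geometry: the upper and lower wishbone lines locate the instant center, and the line from the contact patch through the instant center locates the roll center.

```python
import numpy as np

def line_intersection(p1, p2, p3, p4):
    """Intersection of the infinite 2D lines through p1-p2 and p3-p4."""
    d1 = np.asarray(p2, float) - np.asarray(p1, float)
    d2 = np.asarray(p4, float) - np.asarray(p3, float)
    t, _ = np.linalg.solve(np.column_stack([d1, -d2]),
                           np.asarray(p3, float) - np.asarray(p1, float))
    return np.asarray(p1, float) + t * d1

# Front-view coordinates in inches (lateral, vertical); placeholder numbers,
# not the Mk. III geometry entered into vsusp.com.
upper_in, upper_out = (8.0, 11.0), (22.0, 12.5)   # upper wishbone pivots
lower_in, lower_out = (7.0, 5.0), (23.0, 5.5)     # lower wishbone pivots
contact_patch = (24.0, 0.0)                       # tire contact patch

ic = line_intersection(upper_in, upper_out, lower_in, lower_out)
# For symmetric left/right geometry, the roll center sits where the
# contact-patch-to-instant-center line crosses the centerline (lateral = 0).
rc = line_intersection(contact_patch, ic, (0.0, 0.0), (0.0, 1.0))
print("instant center:", ic, "| roll center height:", round(rc[1], 2), "in")
```

Sweeping the outboard pivots through bump and roll travel and repeating this calculation is what shows how much the roll center migrates laterally.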
Part 2.
To characterize the vehicle's cornering behavior, a pseudo steady-state model was derived. Only lateral acceleration is considered, so the model amounts to summing all of the external forces and setting them equal to the vehicle mass times the lateral acceleration. The external forces acting at the tires also induce a yawing moment on the vehicle, which is generally not the moment required to hold the constant-radius, constant-speed corner assumed at the outset; hence the model is only a pseudo steady-state model. Tire data for the Hoosier 6.0/18.0-10 LC0 is parsed and interpolated to get the lateral force each tire produces for a given load, slip angle, pressure, and inclination.

The process went as follows:

  1. Import car parameters
  2. Assume body angle, steering angle, and cornering scenario
  3. Guess input lateral acceleration
  4. Compute tire forces to compute output lateral acceleration
  5. Analyze output
  6. Return to step 2 or 3 unless an acceptable solution is found
If the output lateral acceleration matches the input acceleration and nothing weird has happened (like a tire lifting off), then the following parameters are checked:
  1. Normalized yawing moment
  2. The stability and control derivatives, dN/dβ and dN/dδ
  3. Tire saturation
  4. Ride rates
In general, the input tire spring rates must be close to the output rates, but they are not particularly sensitive to changes in the other parameters. (1) is used to characterize the under/oversteer of the vehicle. (2) is used to determine whether changes in body or steering angle that pull the car into a tighter corner correspond to decreases in the yawing moment. (3) is used to check that the front tires saturate before the rears, so that the driver feels the front tires lose grip first and has an easier time making corrections. (4) is used to make sure the car does not get too close to bottoming out.
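As a concrete (and heavily simplified) sketch of steps 3 through 6, the snippet below runs the same fixed-point iteration on a bicycle model in Python, with a placeholder tanh tire curve standing in for the interpolated Hoosier data and with lateral load transfer ignored; the numbers are illustrative, not the Mk. III parameters.

```python
import numpy as np

# Placeholder stand-in for the interpolated Hoosier LC0 data: a saturating
# lateral-force curve Fy(slip angle, normal load). Coefficients are made up.
def tire_fy(slip_deg, load_n, mu=1.6, stiffness=0.28):
    return mu * load_n * np.tanh(stiffness * slip_deg)

def steady_state_ay(mass, wheelbase, a, speed, body_slip_deg, steer_deg,
                    tol=1e-4, max_iter=100):
    """Fixed-point iteration on lateral acceleration (steps 3-6 above),
    reduced to a bicycle model with static axle loads."""
    b = wheelbase - a                        # a = CG-to-front-axle distance, m
    fz_front = mass * 9.81 * b / wheelbase   # static axle loads; lateral load
    fz_rear = mass * 9.81 * a / wheelbase    # transfer is ignored in this sketch
    ay = 0.0                                 # step 3: guess input lateral accel
    for _ in range(max_iter):
        yaw_rate = ay / speed                # steady-state corner: r = ay / V
        slip_front = steer_deg - body_slip_deg - np.degrees(a * yaw_rate / speed)
        slip_rear = np.degrees(b * yaw_rate / speed) - body_slip_deg
        fy_total = tire_fy(slip_front, fz_front) + tire_fy(slip_rear, fz_rear)
        ay_new = fy_total / mass             # step 4: output lateral acceleration
        if abs(ay_new - ay) < tol:           # steps 5-6: accept or iterate again
            return ay_new
        ay = ay_new
    raise RuntimeError("no converged solution; revisit the assumed angles")

# Example: ~300 kg car with driver, 1.55 m wheelbase, mild steering input.
print(steady_state_ay(mass=300.0, wheelbase=1.55, a=0.82, speed=12.0,
                      body_slip_deg=1.0, steer_deg=3.0))
```

The full model does this for all four tires with load transfer included, and then evaluates the yawing moment and derivatives listed above.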

In general, the most important parameters to tweak were the weight distribution and the wheel rate distribution. The final weight distribution was around 53/47 rear-to-front, and the wheel rates we ran were 125/100 lb/in front-to-rear.

Part 3.
The final thing to do was to estimate the damping rates. This was done by deriving the dynamics of a five-mass, five-degree-of-freedom representation of the car and then simulating its response to bumps and cornering. The following graphic shows the five masses and their connections.

 

The linearized system of five second-order ODEs is put into state-space form, and MATLAB's lsim() command is used to simulate the response to bump and cornering inputs.
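The sketch below shows the same workflow in Python with scipy.signal.lsim, but reduced to a quarter-car (two-mass) model of a single corner so it fits in a few lines; the masses, rates, and bump input are placeholder values, not the Mk. III numbers.

```python
import numpy as np
from scipy.signal import lsim  # Python analog of MATLAB's lsim()

# Quarter-car (two-mass) stand-in for one corner of the five-mass model;
# the values below are placeholders, not the Mk. III parameters.
ms, mu = 70.0, 10.0           # sprung and unsprung mass per corner, kg
ks, kt = 26000.0, 120000.0    # wheel rate and tire spring rate, N/m
cs = 1500.0                   # damping at the wheel, N*s/m

# States: [zs, zs_dot, zu, zu_dot]; input: road height zr (the bump).
A = np.array([[0.0, 1.0, 0.0, 0.0],
              [-ks / ms, -cs / ms, ks / ms, cs / ms],
              [0.0, 0.0, 0.0, 1.0],
              [ks / mu, cs / mu, -(ks + kt) / mu, -cs / mu]])
B = np.array([[0.0], [0.0], [0.0], [kt / mu]])
C = np.eye(4)
D = np.zeros((4, 1))

t = np.linspace(0.0, 2.0, 2000)
zr = np.where(t > 0.1, 0.02, 0.0)        # 20 mm step "bump" at t = 0.1 s
_, y, _ = lsim((A, B, C, D), U=zr, T=t)  # outputs follow the state order
print("peak sprung-mass displacement:", y[:, 0].max(), "m")
```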

Constant high-speed damping was used in the bump simulations, while constant low-speed damping was used for the cornering simulations. The following is a plot of the front wheel's response to a bump (y-axis in meters). The Bode plots generated from the model were also used to choose damping rates by minimizing their peaks.




The following is a plot of the chassis's step response.

The high-speed damping rates were adjusted to lower the peaks of the Bode response plots, and the low-speed damping rates were adjusted to minimize the settling time. The wheel damping rates were then converted to spring damping rates so that they could be compared with the dyno plots produced by Ohlins, the manufacturer of the shocks we were using.

Thursday, April 3, 2014

Update: Arduino Camera Dolly


This summer I finished the second iteration of my programmable camera dolly. Everything worked relatively predictably, and from the process I gained a deeper understanding of which programming, design, and manufacturing approaches work for a reasonably small project such as this.

While testing my C++ menu library, I discovered that unpredictable behavior increased after adding a large number of menu objects. I'm not sure whether this is due to the limitations of the Arduino or simply to my incomplete understanding of its memory usage.

With respect to design and manufacturing, there were a number of simplifications that could have been made. For one, getting current to the servo that panned the camera up and down should have been approached differently. Without going into too much detail, this iteration used a copper contact plate that required CNC milling. Instead of milling such a plate, conductive washers fitted around the vertical axle could perhaps have been used; by eliminating the need for a CNC mill, the manufacturing process becomes cheaper and faster. Secondly, a number of parts should have been 3D printed instead of laser-cut from acrylic. Time should not have been spent making 3D assemblies by gluing the edges of flat acrylic; the laser-cut acrylic should have been reserved for purely 2D components. Finally, a digital servo should have been used instead of an analogue servo. In this case I made a very simple and avoidable mistake.

The following test was filmed on my roof.




This next video shows the machining of the contact plate on a Haas Mini Mill.



A pictorial documentation of the manufacturing process can be found >>> here <<<.

There are a few things I'd like to note. Firstly, 400-step steppers were definitely overkill; 200 steps would have sufficed. Secondly, the process for seating the bearings was pretty cool. In most applications, the holes that receive the bearings are heated (to expand the fitting space), and the surrounding material is usually metal. Here, the same process worked just as well by holding a lighter near each plastic mounting hole. Finally, I would like to release a final design: I want to polish the code further, make a Solidworks parts list and a laser-cutting AI file, and include full directions for building such a dolly. Essentially, I want anyone with an interest in time-lapse photography to be able to produce a programmable camera dolly at low cost and add a new dimension to their work.


Wednesday, April 2, 2014

Arduino Library for Adafruit's 2X16 LCD Shields





In order to make programming my camera dolly's user interface easier, I decided to create my own Arduino library. While there were a couple of navigation libraries for Adafruit's 2X16 LCD Shields, I wanted to learn how to write a library, something I hadn't done before. Additionally, I had in mind a simpler interface than what I found online, and I wanted to make numerical input more efficient by entering numbers by place value (sketched after the feature list below).

Here are a few features:

  • Intuitive menu navigation
  • Supports selection of multiple options
  • Numerical input of integer and decimal values
  • Optional delay for static menus
  • Optional menu headings
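To show what "entering numbers by place value" means in practice, here is a hypothetical sketch of the behavior; the library itself is C++ for the Arduino, so this Python snippet only illustrates the idea, not the actual API.

```python
# Hypothetical Python sketch of place-value entry on the 2X16 LCD: up/down
# changes the digit under the cursor, left/right moves between place values.
def adjust(value, place, delta, decimals=2):
    """Add `delta` to the digit at 10**place (negative places are decimals)."""
    return max(0.0, round(value + delta * 10 ** place, decimals))

value, place = 12.50, 0            # cursor starts on the ones digit
value = adjust(value, place, +1)   # UP       -> 13.50
place -= 1                         # RIGHT    -> tenths digit
value = adjust(value, place, -3)   # DOWN x3  -> 13.20
print(value)
```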
The .zip file containing the library itself, examples, and documentation can be found here (last update: Oct. 11, 2014). To get started, go to the "documentation" folder and open "home.html". From there, you can learn how to install the library; the page also links to several examples and the library's function list.

Many thanks to Prof. Silvan Linn, Grant Paul and Cheng Cheng for their generous help.

Tuesday, April 1, 2014

Arduino Camera Dolly




As a summer project, I decided to design and build a programmable camera dolly in order to take time-lapse footage. An Arduino microcontroller is the "brain" of the dolly and runs a medium servo and, through a pair of Big Easy Drivers, two stepper motors. The steppers are both NEMA 17's with 400 steps per revolution. The stepper that controls lateral movement moves the dolly 0.1 mm per step, while the stepper dictating rotational movement turns the camera 0.25 degrees per step in half-step mode. The whole system is held together by custom 3D printed parts I designed in Solidworks, two 3/4 in. plastic pipes, and a few shafts, bearings, and belts. The LCD screen displays a simple interface that can be used to change the camera's path before it takes the photos that are stitched together into a time-lapse film.


From this project, I learned a great deal about stepper motors, LCD interfacing, coding, and 3D printing. The entire process allowed me to discover the many caveats that become apparent as things start or stop working for no apparent reason. 



There's a minor problem with the rig. The servo doesn't keep the camera in position 100% of the time. I'm not sure if that's because it isn't receiving enough power through the extensive wiring or because it's simply too weak. Either way, it's a bit of a problem, because when the servo draws too much current the Arduino resets.



Here are a few things I learned through the process:

3D printing
  • To avoid bolts and screws. If it’s not a tank, there’s no need to design it like one. 
  • To use epoxy. Better yet, print everything in one piece if you can while still maintaining every part's integrity
  • How to take into account printing tolerances along the different axes
Design
  • Making sure every detail is worked out in Solidworks. It prevents things from getting messy later.
Coding
  • LCD interfacing
  • The AccelStepper library
For the next iteration:
  • No bolts or screws
  • Everything down to the wiring modeled in Solidworks
  • Less wiring
  • Unlimited rotational freedom around the z-axis

Update: The problem with the servo was simply that I was running it directly off the Arduino, so the current was too low. This has been fixed with a UBEC that provides enough current at the correct voltage. I finished CADding the structural components of the camera dolly; all that's left is the wiring, which I've never done before, so it should be fun. Here are a couple of pictures:





Many thanks to Prof. Silvan Linn at San Francisco State University for his guidance and support.  

The New York Times Crossword Puzzle

Below is a picture of the crossword puzzle that I constructed for The New York Times. Edited by Will Shortz, the puzzle was published on Monday, January 30, 2012.
To do the puzzle, simply download the PDF here. You can also read Wordplay, the NY Times blog that includes readers' comments in response to the puzzle.


Night Sky with Python

After seeing the digital planetarium exhibit at the California Academy of Sciences, I decided that I wanted to paint the night sky on my ceiling and walls using glow-in-the-dark paint. To do this, I wrote a Python program to chart each star's position and magnitude relative to a single point in space. I plan to complete the project in Summer 2013.
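As a rough sketch of the approach (not the original script), the snippet below projects a few stars, given approximate catalog coordinates and magnitudes, onto a plane about the north celestial pole and writes them out as SVG circles whose size shrinks with magnitude; the projection and sizing rule here are illustrative choices.

```python
import math

# Approximate catalog values (RA in hours, Dec in degrees, visual magnitude).
stars = [("Dubhe", 11.062, 61.75, 1.79),
         ("Merak", 11.031, 56.38, 2.37),
         ("Polaris", 2.530, 89.26, 1.98)]

def project(ra_hours, dec_deg, scale=300.0):
    """Stereographic-style projection about the north celestial pole."""
    ra = math.radians(ra_hours * 15.0)
    r = scale * math.tan(math.radians(90.0 - dec_deg) / 2.0)
    return r * math.cos(ra), r * math.sin(ra)

circles = []
for name, ra, dec, mag in stars:
    x, y = project(ra, dec)
    radius = max(0.5, 4.0 - 0.8 * mag)   # brighter star -> bigger dot
    circles.append(f'<circle cx="{400 + x:.1f}" cy="{400 - y:.1f}" '
                   f'r="{radius:.1f}" fill="white"/>')

svg = ('<svg xmlns="http://www.w3.org/2000/svg" width="800" height="800">'
       '<rect width="800" height="800" fill="black"/>'
       + "".join(circles) + "</svg>")
with open("night_sky_sketch.svg", "w") as f:
    f.write(svg)
```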

The images below are screenshots of the scalable vector graphics (SVG) that were generated by the program.

To see the SVG pages, click the links below:
Night sky
Ursa Major and Ursa Minor
Stargate-like projection (see 2001: A Space Odyssey)

Click here to access all files from the project, including the Python script, a readme file with detailed instructions, and a .zip archive of everything.

Night Sky

Ursa Major and Ursa Minor

Stargate-like Projection