Will Barbour and Lee Smith (TDOT) presented during a connected and automated vehicles webinar hosted by The Eastern Transportation Coalition. They discussed the I-24 MOTION testbed and its implications for CAV and human driver trajectory data. The testbed will begin producing terabytes of anonymized trajectory data, which state DOTs, universities, and industry can then use for CAV and other transportation studies. Watch their presentation below or on YouTube.

I had a great discussion with Jayson Luber from Denver’s ABC 7 news on his podcast ‘Driving You Crazy!’. We talked about what traffic in a post-Covid world will look like, based on our research in ‘The Rebound’.

Listen to the episode on Apple Podcasts, iHeartRadio, or PodBean.

I recently co-authored an analysis article with Yue Hu and Prof. Dan Work on how vehicle travel times could increase if transit riders switch modes and drive instead, due to Covid-19 concerns. The analysis used data from the American Community Survey across all major US metro areas and showed that transit-heavy cities are highly susceptible: a modest mode shift from transit to personal vehicles could increase all road users’ travel times by 5-10 minutes each way.

While very small travel time increases (e.g., less than 5 minutes one-way) may not seem concerning, this could equate to hundreds of thousands of additional hours spent in traffic each day and unnecessary emissions and pollution. It certainly highlights the importance of transit in major cities to keep vehicle volumes at manageable levels.
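For intuition on why small shifts matter (my own back-of-the-envelope sketch here, not the article’s methodology), the standard Bureau of Public Roads (BPR) volume-delay function shows how travel time grows nonlinearly as volume approaches capacity; the corridor numbers below are hypothetical:

```python
def bpr_travel_time(free_flow_min, volume, capacity, alpha=0.15, beta=4):
    """Standard BPR volume-delay function: congested travel time in minutes."""
    return free_flow_min * (1 + alpha * (volume / capacity) ** beta)

# Hypothetical corridor: 30-minute free-flow trip on a road near capacity.
base = bpr_travel_time(30, 9500, 10000)
# A modest mode shift from transit adds roughly 10% more vehicles.
shifted = bpr_travel_time(30, 10450, 10000)
print(round(shifted - base, 1))  # extra minutes per trip, for every driver
```

Because delay grows with the fourth power of the volume-to-capacity ratio, the same absolute shift costs far more on corridors that are already near capacity, which is exactly the situation in transit-heavy cities.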

Read the article here on Medium.

I am honored to be chosen as the Top Doctoral Fellow by the Dwight D. Eisenhower Transportation Fellowship Program. This was my fourth consecutive year being awarded the fellowship from FHWA, and it was a very nice surprise to take home the Top Doctoral Fellow award in my final year of eligibility.

Almost as nice as the award was the opportunity to deliver the first presentation at the doctoral research showcase at the Transportation Research Board Annual Meeting. I spoke about our variety of micromobility work at Vanderbilt.

Image credit: FHWA/USDOT Photography

In a follow-on from my summer class at Vanderbilt Summer Academy, I joined the Programs for Talented Youth (PTY) for a one-day version of my course Sensors and Big Data Analysis in the Weekend at Vanderbilt University (WAVU) program. The course covered electrical engineering of sensor prototypes, data collection and analysis, and microcontroller programming.

At a middle school STEM day hosted by the Vanderbilt student chapter of the American Society of Civil Engineers, I had the opportunity to conduct a hands-on activity focused on bike infrastructure planning with over 100 students from Metro Nashville Public Schools. Students pretended to be urban planners with the challenging task of multimodal infrastructure planning with limited resources.

The event was covered by the Vanderbilt School of Engineering in a recent article.

Topics in sensor deployments, smart cities, and data analysis were all part of a class I taught at Vanderbilt Summer Academy titled “Sensors and Big Data Analysis”. I had the opportunity to design this course and teach gifted high school students from across the United States and abroad during four weeks of class and 120 hours of instruction. Students learned about electrical engineering and building their own sensor prototypes, programming microcontrollers for data collection and control tasks, and data analysis techniques in spreadsheets and Python.

Image credit: Vanderbilt Programs for Talented Youth

Along with Prof. Work, Prof. Philip, Erin Hafkenschiel, and Leigh Shoup, I co-authored an opinion article in the Tennessean, discussing the use of electric scooters in Nashville and the larger mobility challenges of the city. The op-ed is titled ‘Scooters are here to stay in Nashville. We have to make it work.’ It highlights the experiences of Vanderbilt in dealing with mobility challenges around campus, and how this insight might help shape the city’s future transportation approach.

Read the article here.

I had the opportunity to travel to Norrkoping, Sweden, to present a new paper titled ‘Data reconciliation of freight rail dispatch data’ to an audience of rail operations experts at the 8th International Conference on Railway Operations Modelling and Analysis (RailNorrkoping). The work details the process for constructing an automated data reconciliation pipeline from any optimization-based dispatching model. This allows for the identification and correction of infeasible dispatching data at the network scale, which is a difficult and costly task even at small scales.

Image credit: Chris Barkan, University of Illinois Urbana-Champaign

I presented at the IEEE Intelligent Transportation Systems Conference in October 2018. My first presentation was to an invited session on smart rail transportation and my second was a presentation of my conference paper: “On the Data-Driven Prediction of Arrival Times for Freight Trains on U.S. Railroads”.

Vanderbilt University’s dockless bike share pilot with operator ofo was honored with a Tennessee Sustainable Transportation Award at the Tennessee Sustainable Transportation Conference in Knoxville, TN, on September 18. The pilot was the first dockless mobility program in Nashville and was very popular with students, faculty, and staff. I performed the analytics on the program data, which will be used to inform campus planning and infrastructure into the future. Research on data sources from active mobility systems is ongoing, particularly with emerging e-scooters. More information can be found in the Vanderbilt news article.

The Array of Things is a research instrument utilizing distributed sensors across an urban-scale area for multidisciplinary research. I, along with Derek and Prof. Work from the lab, attended the Array of Things User Workshop to assess the current state of the instrument, learn about its applications and functionality, and report the findings to NSF.

Vanderbilt University began a dockless bike share pilot program with ofo in March 2018. The pilot garnered unprecedented interest amongst students, staff, and faculty in its first month, with over 60,000 rides. Campus administration sought to gain insight from bike data to study how people move throughout campus and to inform decisions on infrastructure.

My preliminary analytics on the first month’s trips revealed that the majority of trips were short rides across campus, but a significant number traveled off campus to specific surrounding neighborhoods and downtown Nashville. Also highlighted were several high-volume biking corridors through campus that warrant further study to ensure mobility and safety.

Comprehensive results were presented to campus administration and leadership, and analysis of new data is ongoing.
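For a flavor of the simplest version of that analysis, here is a minimal sketch using hypothetical trip records and made-up zone labels (the actual ofo dataset and pipeline were far richer):

```python
from collections import Counter

# Hypothetical trip records; the real data included timestamps and GPS traces.
trips = [
    {"duration_min": 4,  "end_zone": "campus"},
    {"duration_min": 6,  "end_zone": "campus"},
    {"duration_min": 3,  "end_zone": "campus"},
    {"duration_min": 25, "end_zone": "midtown"},
    {"duration_min": 40, "end_zone": "downtown"},
    {"duration_min": 5,  "end_zone": "campus"},
    {"duration_min": 8,  "end_zone": "campus"},
    {"duration_min": 30, "end_zone": "midtown"},
]

# Separate short on-campus hops from longer trips leaving campus.
on_campus = [t for t in trips if t["end_zone"] == "campus"]
off_campus_counts = Counter(t["end_zone"] for t in trips if t["end_zone"] != "campus")

share_on_campus = len(on_campus) / len(trips)
print(f"{share_on_campus:.0%} of trips ended on campus")
print(off_campus_counts.most_common())  # destination zones by trip volume
```

Aggregating trip endpoints by zone like this is what surfaces the off-campus destinations and the high-volume corridors worth a closer look.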

Selected as 2018 Eno Center for Transportation Fellow

In March of 2018, I was selected by the Eno Center for Transportation for participation in the Future Leaders Development Conference. The conference focuses on the development process of national transportation policy and will take place in May 2018. It is a great opportunity to continue my education on transportation policy. I am incredibly excited to learn from the experts and leaders speaking at the conference, as well as the other selected students.

In January 2018, the research group was invited by the National Science Foundation to present a demonstration of our research at the Washington D.C. Auto Show. The demo consisted of a virtual reality headset reliving the “ring road experiments” that showed how an autonomous vehicle operating in mixed human/autonomous traffic can dampen traffic waves and lead to smoother, lower-emissions traffic.

The hardware required to run high-end virtual reality is quite powerful, which posed an issue when transporting everything needed for the demo. My personal desktop computer, equipped with an NVidia GTX 1070, had to be taken on the plane. When running the VR demo for 10 hours, I had to consider the thermals of the computer and whether a more compact hardware setup, such as a laptop, could have lasted all day. Additionally, the conference center had notably slow wireless internet, so we had to figure out a way to run the VR video locally[1] and ended up using DeoVR. The player was fairly good (though it lacked a repeat function), but video codec support required some experimentation to achieve stable playback.

Here is a video similar to what was presented (credit: Fangyu Wu):

Another video showing a more comprehensive visualization:

  1. Many players support playback of videos on YouTube. 

William Barbour and Raphael Stern named 2018 Dwight David Eisenhower Transportation Fellows

In January of 2018, I was honored to be awarded the Dwight David Eisenhower Transportation Fellowship from the Federal Highway Administration. My friend and colleague, Raphael Stern, was also awarded the Fellowship. The awards were presented at the Transportation Research Board Annual Meeting 2018.

This is a discussion of the server specifications, design rationale, and building process. I will also share what I learned during the process, along with some advice for computer building.

Update October 2019

A few months ago, I replaced the Enermax Liqtech AIO cooler due to mysterious thermal throttling issues that were occurring on the server. At the time, I did not have the extra time to track down the issue fully and needed to get the server back online, so I shipped overnight a Noctua NH-U14S TR4-SP3 air cooler. That solved the problem, and CPU thermals have been low and stable ever since. I shelved the Enermax AIO and didn’t give it a second thought until I came across a Gamers Nexus YouTube video detailing systemic failures in these Enermax Liqtech units due to some sort of corrosion build-up and blockage. I have not yet investigated whether this was the cause of my issues, but it seems highly likely based on their strong findings.

I have updated the parts list below to reflect the new CPU cooler - I highly recommend the Noctua NH-U14S for its simplicity and performance, even on this demanding processor.

Computer specifications

The previous research workstation computer that I built had significantly less power. It was designed for 24/7 operation with an Intel Xeon quad-core CPU (E3-1275v5) and ECC RAM. As it turns out, a lot of the computation could be asynchronously parallelized quite easily to take advantage of higher core counts. So in the second incarnation of the research workstation, I opted for a CPU from the high-end desktop (HEDT) segment, where chips are built for tasks such as video editing and rendering.

This led to the selection of the AMD Ryzen Threadripper 1950x CPU, which boasts a high core count (16 physical) without breaking the bank or sacrificing core speed (3.4+ GHz). The Ryzen series supports overclocking, and the ASUS Zenith Extreme motherboard makes this entirely possible with active VRM cooling and robust power delivery. The new configuration maintains the same high-speed NVMe solid state drives via the M.2 interface. Not only do these drives have extremely high bandwidth, but the input/output operations per second (IOPS) and latency are both superb[1]. The M.2 interface on most motherboards routes through the chipset, which can hurt latency, but the Zenith Extreme appears to route two of its M.2 slots directly into the PCIe lanes. Lower-speed SATA SSDs are used for “warm” storage, where data is used semi-frequently, and HDDs for data that is accessed infrequently.

The full list of components:

  1. AMD Ryzen Threadripper 1950x CPU - 16 cores at 3.4 GHz base clock
  2. Noctua NH-U14S TR4-SP3 air cooler (previously Enermax AIO, see update above)
  3. ASUS Zenith Extreme sTR4 eATX motherboard
  4. 64GB Corsair Vengeance RAM at 3466 MHz
  5. Samsung 960 EVO 1TB NVMe solid state drives - hot/active storage
  6. Samsung 850 EVO 1TB SATA solid state drives - warm storage
  7. Western Digital 8TB data center HDDs - cold storage and backup
  8. EVGA GeForce GTX 1080 hybrid-cooled GPU
  9. EVGA 1000 watt power supply (80-plus Gold)
All components gathered before assembly.

A number of components were actually tough to get hold of at the time they were purchased. The release of Threadripper was still cooling down, and the Zenith Extreme was in and out of stock. The global DRAM shortage was still in full swing, so DDR4 prices were exorbitant. On top of that, the cryptocurrency frenzy was ramping up; thankfully I got the GTX 1080 before it reached its fever pitch, when GPUs were around double their intended retail price.

Test assembly

It is always a good idea to test components outside the case, where assembly is painless. Sending a component for RMA is a lot easier when it’s not screwed down. I typically place my motherboard on the cardboard insert that comes in the box; components aren’t as fragile as they look[2].

As you can see, the Threadripper CPU is absolutely massive. Its dimensions are daunting, and it is also plainly heavy. We know that the heat spreader is soldered directly to the processor dies for better heat transfer (compared to thermal paste/adhesive), but I suspect that the heat spreader is also considerably more robust than that of the typical processor. This complements the AIO liquid cooler that was used for the CPU, which has a full-coverage water block to take advantage of the heat generated across the distributed processor dies. AIO liquid coolers for smaller sockets vary quite little in performance, but it is clear that the heat dissipation capacity of this Enermax cooler is significantly higher for the TR4 socket[3].

Initial test assembly outside the case. The size of that CPU still astonishes me. Note the strange orange sTR4 bracket; it's an interesting concept and were it not for the issues with the Foxconn sockets[2], I would be completely on board with it.

Final assembly

This was on the longer side of assembly times. The Thermaltake case was nice and spacious (massive, in fact), but a number of other factors contributed to a tedious installation: a separate fan extension board that came with minimal installation instructions, long fan-cable runs that dictated some component placements, and extremely tight case screws for which I had only a tech-sized screwdriver (see Tools recommendations 1 and 2 below). I thought I had a lot of components, but this case could easily fit another hybrid GPU, 4-5 additional hard drives, and 1-2 additional fans. Airflow seems excellent (this has been a problem in some newer glass cases) and the build certainly makes a statement.

Final assembly in the case with 360mm all-in-one liquid cooler. The quality of the Enermax cooler was rather impressive. Some reviewers noted that one can even drain the cooler and refill with coolant. I noticed a bit of turbulence noise coming from this one, so it may need a top-off in the future.
Final assembly in the Thermaltake case. It still looks a bit empty - a lot of that space is for custom water cooling loop components. If this was a personal build, I think I would have gone that route.

Aftermath

The BIOS on the ASUS motherboard was decent out of the box. I noticed a few stability issues, but these seemed to be cleaned up with a BIOS update that was rather painless via USB drive. I have not been able to get the RAM to overclock; the DOCP profile shows up in the BIOS, but the system won’t POST after enabling it for the rated 3466 MHz. I have not yet attempted CPU overclocking, but may do so in the future.

Windows 2016 Server was installed as the operating system (hence why I refer to the machine as a server and workstation interchangeably). The primary factor in the decision was supporting multiple simultaneous users. I’m aware that any number of other operating systems can do this, but Windows is supported by the technology staff. I generally prefer a Unix-based OS, but so far the Windows Server installation has been nice and clean, free of bloatware, and compatible with most Windows 10 applications.

Tools

There aren’t really that many tools required to assemble a computer. If absolutely necessary, the vast majority of components could be installed with a single Phillips-head screwdriver. However, not all tools are created equal; here are my suggestions:

  1. iFixit Pro Tech Toolkit - The screwdriver and bit set included with this kit is the best I have ever used. I have used these bits quite aggressively and they haven’t shown any significant wear, unlike the many others that I have seen round off after slipping on a single screw.
  2. Full-size screwdriver - Multi-bit ratcheting screwdrivers work well for computers (a full-length and stubby version can both be useful). Some screws, particularly on steel cases, can be far too tight and difficult to remove with a small screwdriver.
  3. Flush cutters - Useful for cutting wires, zipties, etc.
  4. Long needle-nose pliers or hemostat - Either can be used for installing cables in tight spaces.
  5. Box cutter or knife - Packaging for high-quality components can be stubborn.
  6. Zip ties, painters tape - Good for securing cables permanently and temporarily (while planning cable routes).
  1. I tried a pair of these drives (Samsung 950 PRO) in a RAID-0 configuration. The increase in bandwidth was nearly double, but the IOPS did not improve substantially (as expected). Ultimately, the hassle with the BIOS settings and operating system compatibility was not worth the tradeoff. 

  2. There have been many reports of CPU installation difficulty for the Threadripper socket (sTR4). Apparently, there are two socket manufacturers - Foxconn and Lotes - and the screws on the former are consistently about a half thread too short. When tightening the retainer on my Foxconn socket, I found it far more effective to apply pressure on the screw while supporting the socket backplate directly with my hand, rather than letting the motherboard flex on standoffs or on the box. 

  3. https://www.gamersnexus.net/hwreviews/3119-360-vs-240-for-threadripper-enermax-liqtech-vs-noctua 

This project was completed by Raphael Stern and myself. We set out to design and build a laser-cut relief map of the Dwight D. Eisenhower National System of Interstate and Defense Highways. This resulted in a number of incarnations, one of which was mounted on stained oak plywood, below. The map and name plate are cut from 1/8-inch birch plywood on a 75-watt laser cutter from an AutoCAD DWG file. The file combined raster engraving for the highways with vector through-cuts for the outline of the contiguous United States. The vector cut was simply scaled up in AutoCAD to form the lower relief map. A later version of this project also added state outlines at a lower raster width and turned out very well.

The Chicago skyline is one of the most recognizable in the world and was the inspiration for this project. I set out to present the most iconic city landmarks in an accent-lighted display to mimic a sunset behind the city.

The skyline was drawn in AutoCAD, half traced roughly from a skyline image and half drawn in to cover the remaining landmarks (including the Hancock Center, the Ferris wheel on Navy Pier, and a river bridge). It took a few trials on the laser cutter to dial in an acceptable scale, since there is some exceptionally fine detail. The final cut was done on 1/4-inch oak plywood, and even at full power, the laser had trouble penetrating. A couple of tedious hours with a razor knife brought the cuts all the way through. I didn’t apply any wood finish to the cut skyline because it already had a beautiful color that contrasted with the other pieces.

I built the shadow box from 1x4-inch oak. I would have liked to take a router to the edges, but I don’t have one yet. The bevels were cut with a circular saw, since that was all I had at the time; they had to be done carefully to ensure the alignment was square. I also lacked a pocket hole guide at the time, so I had to free-hand drill the pocket holes and countersinks. This actually would have worked out perfectly had I bought better quality screws. I kept the pilot holes quite tight, and a few of the screws broke off in the holes due to the torque I placed on them. After cutting the protruding screws off with a Dremel and filling the holes with wood putty, the mistake was hidden. I applied two coats of red oak stain to create some contrast with the skyline while keeping the grain visible.

The back of the shadow box is a piece of 1/4-inch poplar plywood with a coat of clear shellac. I recessed it from the back of the shadow box by trimming the inner perimeter with quarter-round molding, stained the same color as the walls. This was where I encountered my second mistake: the oak walls were very difficult to hammer into with finish nails after assembly of the box. Had I thought ahead, I would have nailed on the molding prior to cutting the walls with the bevel, which would have taken care of the slight gaps left by lack of precision in the miter cut of the molding as well.

The final steps were mounting the skyline using some scrap quarter-round molding and wood glue and adding a USB-powered LED strip behind the skyline. The light strip is well-hidden from the front and has a thin cord that can be routed out of the box and will plug into any standard USB charger. The color changing effect it creates is incredible in the dark.