Uturn Data Solutions is the data integration provider for one of the largest municipal smart lighting projects in the United States. The project includes upgrading over 270,000 streetlights, while also integrating with a legacy 311 system, a department of transportation GIS system, and an energy utility billing system.
During year one of the four-year project, four distinct integrations are being developed and hosted on AWS to facilitate information flows between these systems. These integrations utilize AWS Lambda, Amazon RDS, Amazon S3, AWS CloudFormation, Amazon CloudWatch, Amazon SNS, AWS CodeBuild, and AWS CodePipeline.
AWS Lambda is being used as the primary compute resource. The AWS Lambda functions have been built using Python 3.6, with multiple workflows including:
- Every 8 hours, CloudWatch Events trigger polling of a third-party ArcGIS database for newly audited streetlights. The data for these streetlights is imported into the third-party application that operates the smart streetlights.
- Every 24 hours, CloudWatch Events trigger polling of the smart streetlight management application for newly provisioned streetlights to be sent to the energy utility for billing updates.
- Every 8 hours, CloudWatch Events trigger polling of the smart streetlight management application for new or updated alarms. The alarms are used to generate or update 311 service requests via the Open311 API provided by the city.
- Every hour, CloudWatch Events trigger queries of the smart streetlight management application in search of updated streetlights that need to be updated in the transportation department’s GIS system.
- Every hour, CloudWatch Events trigger queries of the transportation department’s GIS system in search of updates that need to be pushed to the smart streetlight management application.
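Each of the scheduled workflows above follows the same basic pattern: a CloudWatch Events rule invokes a Lambda handler, which works out the time window since its last run and polls the source system for changes in that window. A minimal sketch of that pattern in Python follows; the `fetch_updates` call is a hypothetical placeholder for the project's actual API client, not part of the source material.

```python
from datetime import datetime, timedelta

POLL_INTERVAL_HOURS = 8  # matches the 8-hour ArcGIS audit workflow's schedule


def polling_window(now, interval_hours):
    """Return (start, end) ISO-8601 timestamps covering the last interval."""
    start = now - timedelta(hours=interval_hours)
    return start.isoformat(), now.isoformat()


def handler(event, context):
    # A CloudWatch Events scheduled event carries its trigger timestamp in
    # the 'time' field, formatted as ISO-8601 with a trailing 'Z'.
    now = datetime.strptime(event["time"], "%Y-%m-%dT%H:%M:%SZ")
    start, end = polling_window(now, POLL_INTERVAL_HOURS)
    # Hypothetical call: query the source system for records changed in-window.
    # records = fetch_updates(modified_after=start, modified_before=end)
    return {"window_start": start, "window_end": end}
```

Deriving the window from the event's own timestamp, rather than `datetime.now()`, keeps the handler deterministic and easy to replay.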
The AWS workflows often process large batches of requests, which could take longer than Lambda's 5-minute maximum execution time. The Lambda functions have therefore been designed to break large batches into smaller, more manageable ones, using Amazon SNS to chain Lambda invocations together. The following diagram shows how the chaining process was implemented:
This diagram shows how larger batches were pulled from the ArcGIS API and split into smaller batches, which were then stored in the RDS database. The last step was to publish ten SNS messages, allowing a degree of parallel processing. The subscribed Lambda function pulls the referenced batch out of RDS, updates the streetlight management system for each item in the batch, then triggers the next batch.
This process has allowed the system to handle batches of any size without hitting the 5-minute maximum execution time, and the use of ten parallel threads helps avoid overwhelming the downstream systems.
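The batch splitting and SNS fan-out described above could be sketched as follows. The topic ARN, batch size, and message shape are illustrative assumptions; the project's real functions also persist each batch to RDS before publishing.

```python
import json

PARALLELISM = 10   # number of SNS messages fanned out per run (per the text)
BATCH_SIZE = 50    # illustrative size for one Lambda invocation's workload

# Placeholder ARN -- the real topic name is not part of the source material.
TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:streetlight-batches"


def split_batches(records, size):
    """Split a large result set into fixed-size batches."""
    return [records[i:i + size] for i in range(0, len(records), size)]


def fan_out(batch_ids):
    """Publish up to PARALLELISM SNS messages, each naming one batch.

    The subscribed Lambda pulls its batch out of RDS, processes each item,
    then publishes the id of the following batch to continue the chain.
    """
    import boto3  # available in the Lambda runtime
    sns = boto3.client("sns")
    for batch_id in batch_ids[:PARALLELISM]:
        sns.publish(TopicArn=TOPIC_ARN,
                    Message=json.dumps({"batch_id": batch_id}))
```

Capping the initial fan-out at ten messages bounds concurrency; the chain itself, not the fan-out, carries the remaining batches.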
These integrations are designed to be transactional, which requires detailed tracking and logging of each transaction. Amazon RDS for PostgreSQL has been used as the primary data store for the integrations, offering a solution that can be easily queried. PostgreSQL was selected because the municipality is already skilled in this database technology.
The RDS instance holds the transactional information the Lambda functions use to work through their processing, along with critical process error logs. Logs in RDS are limited to critical errors and stack traces; the majority of logging can be found in Amazon CloudWatch.
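A sketch of what writing a critical-error row to the Postgres instance might look like is shown below. The `process_errors` table name, its columns, and the length cap are assumptions for illustration, not the project's actual schema; `conn` is any DB-API connection such as one from psycopg2.

```python
import traceback

MAX_TRACE_LEN = 4000  # illustrative cap so a runaway trace can't bloat the table


def format_trace(exc, limit=MAX_TRACE_LEN):
    """Render an exception's stack trace, truncated to a storable length."""
    text = "".join(
        traceback.format_exception(type(exc), exc, exc.__traceback__))
    return text[:limit]


def log_critical_error(conn, workflow, exc):
    """Insert one critical-error row; routine logging stays in CloudWatch."""
    with conn.cursor() as cur:
        cur.execute(
            "INSERT INTO process_errors (workflow, message, stack_trace) "
            "VALUES (%s, %s, %s)",
            (workflow, str(exc), format_trace(exc)),
        )
    conn.commit()
```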
These new AWS integrations were some of the first workloads that the city has deployed to AWS. Uturn designed this new account to showcase some of AWS’s top DevOps features.
The first DevOps feature used was CloudFormation. It was used to define the VPC so that it could be reused for the different networks the municipality requires (public, private, HIPAA, etc.). In addition to the VPC, a CloudFormation template was used to create the Lambda function and RDS instance.
The second set of DevOps features used were AWS CodePipeline and AWS CodeBuild. The integration uses AWS CodePipeline to automatically trigger an AWS CodeBuild job, which does the following:
1. Pulls down code from GitHub
2. Installs Python 3.6 in an Ubuntu environment
3. Creates a virtualenv for Python 3.6
4. Loads the requirements file for the Python 3.6 application
5. Zips the code and libraries
6. Uploads the zip file to S3
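Steps 5 and 6 above could be sketched as a small Python script run inside the build; the bucket and key names here are placeholders, not the project's actual values.

```python
import os
import zipfile


def zip_package(source_dir, zip_path):
    """Zip the application code and its vendored libraries for Lambda."""
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for root, _dirs, files in os.walk(source_dir):
            for name in files:
                full = os.path.join(root, name)
                # Store paths relative to source_dir so the Lambda runtime
                # finds modules at the root of the archive.
                zf.write(full, os.path.relpath(full, source_dir))
    return zip_path


def upload_package(zip_path, bucket="build-artifacts", key="lambda/app.zip"):
    """Upload the deployment package to S3 (placeholder bucket/key)."""
    import boto3  # available in the CodeBuild environment
    boto3.client("s3").upload_file(zip_path, bucket, key)
```

Keeping archive paths relative to the package root matters: Lambda imports the handler module from the top level of the zip, so absolute or nested build paths would break the deployment.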
After the CodeBuild job completes successfully, CodePipeline triggers a CloudFormation template that deploys the newly created zip file to the Lambda function.
The net result of these efforts is a core set of integration capabilities that will extend to meet the varied needs of the city for years to come. The city is currently building a next-generation 311 platform that will facilitate significantly more automation and information-sharing. Future enhancements likely include actual-usage-based energy billing that is uncommon today for installations of this magnitude.