Exploring MongoDB Ecosystem in Skunkworks (Vaidya PoC)

Ankur Raina
Jul 18, 2018


Several times a year, MongoDB Engineering (by the way, did you check MongoDB 4.0?) organises a three-day peace-and-coding session across all of its global offices. We call it Skunkworks. For these three days, engineers decide for themselves what they want to work on. Preparations for Skunkworks start several months in advance, when people begin adding and sharing project ideas. Other members can choose to join someone else’s idea, pitch their own and look for collaborators (#BuildTogether), or go solo. The latest Skunkworks, which ended today, ran from 16th to 18th July.

For quite a while now, I have been interested in working on Go projects. This year, I had planned to spend my first two days solving Gophercises and Exercism problems as a refresher on the Go ecosystem, and the last day exploring some of the smaller MongoDB Enterprise tool repositories to get familiar with their internals. I had also been discussing other interesting projects with members of the team, but in my heart of hearts I really wanted to focus on the tech stack this year more than anything else.

However, on the day of Skunkworks, my colleague Nishant and I discussed several ideas, and I was finally convinced to get on board with his. We decided to build a small PoC application, which we named MongoDB Vaidya (vaidya is Sanskrit for physician, one who can diagnose a disease from the pulse), to help MongoDB Atlas customers receive a basic analysis of their log files (mongod, mongos, audit logs, etc.) by email every day or on demand. The idea was as much about fun as it was about learning.

Join us on the journey of building this simple app with the MongoDB ecosystem.

The idea was to grab the cluster log files using the MongoDB Atlas API, run analysis on them with existing tools (such as mtools), and send the output files to the customer by email. We decided to use Go to implement the backend of this application and MongoDB Stitch code for the remaining glue work.
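To give a flavour of the log-grabbing step, here is a minimal sketch of downloading a compressed mongod log through the Atlas API, which requires HTTP Digest Authentication. The group ID, hostname, credentials and output filename are placeholders, and the endpoint path is the one documented for the Atlas API at the time; treat this as an illustration rather than the exact code we wrote.

```go
// Sketch: fetch a compressed mongod log for one Atlas host using digest auth.
package main

import (
	"io"
	"log"
	"os"

	digest "github.com/xinsnake/go-http-digest-auth-client"
)

func main() {
	// Placeholder group ID and hostname; the real values come from the request document.
	uri := "https://cloud.mongodb.com/api/atlas/v1.0/groups/<GROUP-ID>/clusters/<HOSTNAME>/logs/mongodb.gz"

	// Atlas API credentials (username + API key) sent via HTTP Digest Authentication.
	req := digest.NewRequest("atlas-user", "atlas-api-key", "GET", uri, "")
	resp, err := req.Execute()
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()

	// Save the compressed log locally so mtools can analyse it.
	out, err := os.Create("mongodb.log.gz")
	if err != nil {
		log.Fatal(err)
	}
	defer out.Close()

	if _, err := io.Copy(out, resp.Body); err != nil {
		log.Fatal(err)
	}
}
```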

We first planned to build a REST API for the backend application that would let the Stitch client request an analysis, but we scrapped that idea later because of some complexities.

Finally, we agreed on an architecture in which:

MongoDB Vaidya
  1. the client UI waits for a button click,
  2. the click triggers a MongoDB Stitch function,
  3. the function enters all the request details into a collection on a cloud database (hosted on MongoDB Atlas),
  4. the backend application polls the database server every 30 seconds for the new set of requests (mgo doesn’t implement Change Streams yet, and the MongoDB Go driver is still in alpha), then downloads the logs and performs the analysis (a minimal polling sketch follows this list),
  5. (A) once the analysis is complete, the backend application uploads the output files to S3, and (B) marks the request as “done” on the same cloud database,
  6. the Stitch application keeps listening for this change event,
  7. grabs the output files, and
  8. sends an email to the customer.
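To make step 4 concrete, here is a minimal sketch of the 30-second polling loop. It assumes a requests collection in a vaidya database and a status field set to “pending” on new requests; those names are placeholders, not the actual schema, and the local connection string stands in for the Atlas one (which needs the TLS workaround shown later).

```go
// Sketch: poll the requests collection every 30 seconds for new work.
package main

import (
	"log"
	"time"

	mgo "gopkg.in/mgo.v2"
	"gopkg.in/mgo.v2/bson"
)

// Request mirrors the (placeholder) shape of a document inserted by the Stitch function.
type Request struct {
	ID       bson.ObjectId `bson:"_id"`
	GroupID  string        `bson:"groupId"`
	Hostname string        `bson:"hostname"`
	Status   string        `bson:"status"`
}

func main() {
	session, err := mgo.Dial("mongodb://localhost:27017")
	if err != nil {
		log.Fatal(err)
	}
	defer session.Close()

	requests := session.DB("vaidya").C("requests")

	// No Change Streams in mgo, so poll on a fixed interval instead.
	for range time.Tick(30 * time.Second) {
		var pending []Request
		if err := requests.Find(bson.M{"status": "pending"}).All(&pending); err != nil {
			log.Println("poll failed:", err)
			continue
		}
		for _, r := range pending {
			// Download the logs, run mtools, upload the output to S3, then mark the request done.
			log.Printf("processing request %s for %s", r.ID.Hex(), r.Hostname)
		}
	}
}
```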
#MakeItMatter
Whiteboard Discussions (Nothing important to look at here)

Backend Code

Gophers playing on Channels

External Libraries Used (without these, it would never have been possible to complete this PoC):

  • github.com/xinsnake/go-http-digest-auth-client : for authenticating against the MongoDB Atlas API, which uses HTTP Digest Authentication
  • github.com/aws/aws-sdk-go/ : for uploading files to S3
  • gopkg.in/mgo.v2 : the database driver (my colleague Wan’s workaround, given here, helped connect to MongoDB Atlas, which requires SSL; a sketch of that approach follows this list)
  • github.com/kr/pty : a pseudo-terminal used as a workaround for an mtools limitation
  • The code to upload to S3 was adapted from this blog post: https://golangcode.com/uploading-a-file-to-s3/
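The gist of the mgo workaround is to parse the connection string and replace the dialer with one that goes through crypto/tls, since mgo doesn’t understand the ssl=true option. A rough sketch of that approach, with a placeholder connection string, looks like this:

```go
// Sketch: connect mgo to an SSL-only Atlas cluster via a custom DialServer.
package main

import (
	"crypto/tls"
	"log"
	"net"

	mgo "gopkg.in/mgo.v2"
)

func dialAtlas(uri string) (*mgo.Session, error) {
	dialInfo, err := mgo.ParseURL(uri)
	if err != nil {
		return nil, err
	}
	// Dial each server over TLS, since mgo has no native SSL option.
	tlsConfig := &tls.Config{}
	dialInfo.DialServer = func(addr *mgo.ServerAddr) (net.Conn, error) {
		return tls.Dial("tcp", addr.String(), tlsConfig)
	}
	return mgo.DialWithInfo(dialInfo)
}

func main() {
	// Placeholder Atlas connection string (without the ssl=true option, which mgo.ParseURL rejects).
	session, err := dialAtlas("mongodb://user:pass@cluster0-shard-00-00.mongodb.net:27017/admin?replicaSet=Cluster0-shard-0&authSource=admin")
	if err != nil {
		log.Fatal(err)
	}
	defer session.Close()
	log.Println("connected to Atlas")
}
```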

The code is available in a GitHub repository here. (Note that this is very rough PoC code and should not be evaluated for correctness or quality.)

MongoDB Stitch Code

MONGODB STITCH AND AWS SERVICES
  1. The user clicks a button, which invokes a function through the Stitch SDK
  2. The function inserts data into MongoDB Atlas
  3. The backend app polls for this data, and
  4. uploads the processed output files to S3 and adds the file info into the database (a sketch of this update follows the list)
  5. Stitch keeps listening for this event, and
  6. triggers a function that works with the AWS stack to grab the output and wrap it in an email
  7. The email report is sent to the user
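Step 4, the write that Stitch listens for, amounts to flipping the request document to “done” once the S3 upload finishes. A minimal sketch of that update on the backend side, with placeholder field and collection names rather than the actual schema, looks like this:

```go
// Sketch: record the S3 output locations and mark the request as done,
// which is the change the Stitch side reacts to before sending the email.
package vaidya

import (
	mgo "gopkg.in/mgo.v2"
	"gopkg.in/mgo.v2/bson"
)

func markDone(requests *mgo.Collection, id bson.ObjectId, s3Keys []string) error {
	return requests.UpdateId(id, bson.M{
		"$set": bson.M{
			"status":      "done",
			"reportFiles": s3Keys,
		},
	})
}
```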

The MongoDB Stitch code used in this project can be found here.

This small project has been a great learning exercise, and we would like to improve on it in the future and see whether it can evolve into a potential new feature. By the end of the project we realised just how much there is to use and explore in the MongoDB ecosystem, and how far the range of offerings has come. For example, MongoDB Compass helped throughout the project for quick CRUD operations during the testing phase, which would otherwise have taken additional effort. We are also grateful to our other team members, who kept our day-to-day work covered while we focused on this.

There were several other interesting and more advanced projects from our colleagues, and we look forward to hearing the details of their experiences.

I hope you enjoyed reading this.

Cheers!
