Interesting Places AI is Being Used In Travel

Artificial Intelligence (AI) is a term applied to an entity that mimics cognitive functions associated with human behavior. This mimicry is the result of many concerted systems, each tasked with processes such as learning, decision making, and language comprehension. The research and developmental efforts […]

Read More
Designing event based applications with incron

Recently we wrote an article on leveraging AWS Lambda to create event-based applications using S3. However, what happens when you don’t have access to S3? What if you are using FTP or shared drives? Luckily, there are still solutions! One way to accomplish this on Linux is using incron. […]
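
As a rough sketch of how this can fit together, the snippet below shows a handler script that an incron watch might invoke. The incrontab entry, the watch path, and the script name are illustrative assumptions, not taken from the article.

```python
#!/usr/bin/env python3
"""Handler that incron runs once per file event.

Assumed incrontab entry (edit via `incrontab -e`):
    /var/data/incoming IN_CLOSE_WRITE /usr/local/bin/handle_file.py $@/$#

incron expands $@ to the watched directory and $# to the file name, and
IN_CLOSE_WRITE fires only after the writer closes the file, avoiding
half-written input.
"""
import os
import sys


def handle(path):
    # Placeholder processing step: clean the file, build a thumbnail, etc.
    return "processed " + os.path.basename(path)


if __name__ == "__main__":
    print(handle(sys.argv[1]))
```

Because incron passes the path as a plain argument, the same script can also be run by hand for testing, with no daemon involved.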

Read More
Reactive Applications with AWS Lambda

Sometimes you may find yourself needing a cron script to clean a file, or perhaps you need to watch a directory of images and create preview thumbnails when they arrive on the server. Processes like these suffer from the same limitation: they require you to poll a script until you get a “successful” result.

This is problematic because it forces the developer to write redundancy checks in the code instead of focusing on the core problem. Moreover, file-watching utilities generally notify you once the file is created, not when the file has finished writing. All of these problems must be accounted for, resulting in more complexity, overhead, and development time.

This is where event-driven programming can greatly reduce your development overhead. Part of that is maintaining a centralized data lake for all of your raw files. Data lakes generally expose an event API for easy management of and access to files within the lake. In our case, Amazon S3 is the data lake of choice, and thanks to AWS Lambda we can hook into the S3 event API with minimal effort for simple use cases like cleaning files.
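
A minimal sketch of such a hook, assuming a Python Lambda handler wired to S3 ObjectCreated notifications; the bucket name and key below are made-up sample values, not from our platform:

```python
import urllib.parse


def lambda_handler(event, context):
    """Receive an S3 ObjectCreated notification and report the file's location."""
    record = event["Records"][0]
    bucket = record["s3"]["bucket"]["name"]
    # S3 URL-encodes keys in event payloads, so decode before using the key.
    key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
    # A real handler would fetch the object with boto3 and clean it here;
    # this sketch just returns what arrived.
    return {"bucket": bucket, "key": key}


# Simulated payload, shaped like an S3 event notification record.
sample_event = {
    "Records": [
        {"s3": {"bucket": {"name": "raw-data-lake"},
                "object": {"key": "incoming/report+2016.csv"}}}
    ]
}

result = lambda_handler(sample_event, None)
```

The handler only runs when an object actually lands in the bucket, which is exactly the polling loop the paragraphs above are trying to eliminate.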

Read More
Optimizing your Docker workflow

We create a lot of single-responsibility services, including fetching mail, downloading groups of files, cleaning data, importing data, and many others. This requires us to create new servers that need to be monitored and maintained, so we use Docker containers to normalize our process and work efficiently. From testing to staging to production, Docker containers provide a simple way to create disposable server images.

The primary drawback is that most Docker images lack proper setup or are not designed for your network or architecture. Below is a list of recommendations that will make creating Docker containers a less time-consuming process.

Read More
Moving One Billion Rows in MySQL (Amazon RDS)

You may remember from our November 2014 article about our switch to Redshift that Koddi uses Amazon Web Services (AWS) to power our platform. While we have moved some of our data to Redshift, we still have quite a bit in MySQL (RDS), and at the beginning of this year we needed to move our main database from one AWS account to another. The normal process when creating a copy of a database in RDS is to take a snapshot and spin up a new database from this snapshot. However, Amazon doesn’t allow you to share snapshots between accounts. This posed the question: how do we efficiently migrate over a billion rows of data?
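
One generic technique for a copy like this (not necessarily the one the full article settles on) is to walk the table in primary-key ranges and issue one `INSERT … SELECT` per range, keeping each statement small enough for RDS to handle comfortably. A sketch of the range calculation, with hypothetical table names in the comments:

```python
def id_ranges(min_id, max_id, chunk_size):
    """Yield inclusive (start, end) primary-key ranges covering [min_id, max_id]."""
    start = min_id
    while start <= max_id:
        end = min(start + chunk_size - 1, max_id)
        yield (start, end)
        start = end + 1


# Each range would drive one statement against the target database, e.g.:
#   INSERT INTO new_db.events SELECT * FROM old_db.events
#   WHERE id BETWEEN {start} AND {end};
# (the `events` table and `id` column are illustrative, not from the article)
chunks = list(id_ranges(1, 1_000_000_000, 500_000))
```

Chunking by key range (rather than LIMIT/OFFSET) means each query is a cheap index range scan, and a failed chunk can be retried without restarting the whole migration.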

Read More
Tools and Methods for Multiple Weekly Deployments

Building an advanced bidding and reporting platform doesn’t just happen overnight. Our development team is constantly working on updating and improving the platform to give our users the best possible experience. You’re unlikely to notice, but Koddi releases updates to the application two to four times each week. Everything from […]

Read More
Scalable Data Modeling with Redshift

One of the major challenges of building an advanced bidding and reporting platform is dealing with the large amounts of data we see come in and out of our system. Our database grows rapidly on a day-to-day basis, so we must ensure that we have enough storage to […]

Read More