Serverless Photo-Sharing App Using Amazon Web Services

March 10, 2017

As the self-appointed IT director of my new family, I was tasked with finding the best solution to easily upload, back up, and share pictures and videos of our newborn daughter. While there are a myriad of cloud services that a normal person would go for, I didn’t want to rely on them to safeguard our most precious moments. To be honest, I was also itching to create a serverless app on AWS, without committing to much cost or maintenance overhead.

AWS has a host of relatively cheap services that make creating small serverless apps easy. The JavaScript API lets you keep most of the logic in the browser, where you can access AWS directly or through a proxy (e.g. API Gateway). I’ll explain how I used them:

  • Cognito for authentication: The main users of this app were my family and friends, so I only had to worry about unintended mistakes as opposed to malicious abuse. This allowed me to create pre-defined users for different roles (e.g. admin and visitor) and use Cognito to authenticate them. The JavaScript API lets you safely hard-code the public identifiers in the web page and log in with just a password.

  • IAM for authorization: Once users are authenticated, IAM should give them the minimal privileges needed for their tasks. For example, I gave file-upload access to admin users, but only read access to visitors. The Principle of Least Privilege prevents users from wreaking havoc. IAM doesn’t offer the finest-grained access levels, but for a trusted set of users, that should be good enough. More flexible authorization, of course, has to be implemented on a proxy or web server.

  • S3 for storage: Amazon’s simple storage is truly simple to use. I used it to store media files, thumbnails, and the static website assets. You may make the static site public, but put media files behind Cognito. The nice thing about S3+Cognito is that you can use the Cognito token in the S3 URL and access it in your website as you normally would with hosted images.

  • DynamoDB for database: Gallery and file lists, timestamps, captions, user access, and comments all have to be stored somewhere. While S3 provides limited ability to store metadata for each file, the permanently-free tier of DynamoDB has enough capacity to store them in a few tables. The NoSQL (or rather schemaless) nature of the database makes it easy to quickly add new features.

  • Lambda for processing: A serverless architecture wouldn’t be complete without a function-as-a-service component! In this case, I used an S3-triggered function to create an entry in the database and process newly-uploaded images and videos. It can do anything from generating thumbnails (Lambda comes pre-packaged with ImageMagick) to dealing with EXIF idiosyncrasies.
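A minimal sketch of such an S3-triggered function is below. The bucket layout (media/ and thumbs/ prefixes), the injected s3/db clients, and the putItem shape are my illustrative assumptions, not the app’s actual code; the real function would also run ImageMagick to produce the thumbnail rather than copy the object:

```javascript
// Sketch of an S3-triggered Lambda handler. Dependencies are injected so
// the core logic stays testable without AWS credentials.
function thumbnailKey(key) {
  // e.g. media/2017/img01.jpg -> thumbs/2017/img01.jpg (assumed layout)
  return key.replace(/^media\//, 'thumbs/');
}

function handleUpload(event, s3, db) {
  // S3 puts one record per uploaded object into the event payload
  return Promise.all(event.Records.map(function (record) {
    var bucket = record.s3.bucket.name;
    // S3 URL-encodes object keys in event notifications (spaces become '+')
    var key = decodeURIComponent(record.s3.object.key.replace(/\+/g, ' '));
    // Register the file in the database, then write the thumbnail object
    return db.putItem({ file: key, uploadedAt: Date.now() }).then(function () {
      return s3.copyObject({ Bucket: bucket, Key: thumbnailKey(key) });
    });
  }));
}
```

Because the handler only sees plain objects and injected clients, the database write and thumbnail step can be exercised locally with stubs.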

As for the front end, there wasn’t much needed for this app. Besides the good ol’ AWS JavaScript API, I used Bootstrap for easy styling, Knockout for two-way data binding, and Unite Gallery as the media viewer. Unite Gallery is an open-source library with a few available themes and easy setup. However, getting videos to play on mobile and handling JPEG EXIF orientation proved challenging.

If I found time to improve the app, these areas would come up next:

  • CloudFormation: As of now, most of the configuration has been done manually and its persistence is at the mercy of AWS. I can use CloudFormation to codify how everything was set up, so it can be recreated if anything goes wrong. Amazon provides CloudFormer to create a template from an existing architecture, but it didn’t cover the bulk of my configuration around Cognito and security policies.

  • Automatic rotation: Not all browsers show the right orientation of cellphone images based on their metadata. I can use a Lambda function to automatically rotate images based on their EXIF orientation field.

  • API Gateway: Using the combination of API Gateway and Lambda, there would be no need to give users direct access to AWS resources. This would improve security and may make the app useful for other people with more serious use cases.

  • Backup: A scheduled task that backs up media files and database records and writes them onto cheaper storage (e.g. AWS Glacier). For a more paranoid approach, I can also store them on another cloud service provider such as Google Cloud.

I’d be happy to get feedback on the design, and ways to improve the architecture.

Chilkoot Trail: The World's Longest Museum

August 08, 2016

Chilkoot Trail, a 53-km trail from Dyea, Alaska to Bennett, BC, was the main way for gold rush prospectors in the late 1890s to get to the Klondike River in Yukon. This slideshow contains photos of my 5-day hike through this historic trail.

Day 0: I fly to Whitehorse, Yukon where the group will get together the day after to drive to the trailhead. I have some time to explore the city.
The world's largest wind vane at the Yukon Transportation Museum. The wind is blowing eastward.
Whitehorse in Arabic letters
A common mode of transport in the old days, used by Royal Mail.
SS Klondike was the largest ship that operated on Yukon River
SS Klondike's engine room
SS Klondike's kitchen
"Gold! Gold! Gold! Gold!" An old newspaper from Seattle shows how the gold rush started. When a large shipment of gold from Yukon arrived in the US, the media started creating hype, and people desperate from the economic downturn swarmed towards Yukon to get a piece of that pie. Little did they know how brutal the winter conditions were, and how little gold they could claim after the news was already out.
Gold rush prospectors had to carry a huge amount of luggage with them. After arriving in Alaska, they had to take the Chilkoot Trail, and after reaching Bennett Lake, they could raft down the Yukon River to Dawson City (the gold fields). Pictured is the 45-degree-sloped Chilkoot Pass, which was the most difficult part of the trail. The Canadian police were stationed at the top of the pass.
Day 1: all packed up and ready to hit the trail. First we need to drive to Alaska for the trailhead.
Emerald Lake gives a preview of what the last part of the trail will look like.
Fireweed is Yukon's flower (their official floral emblem)
The road to Skagway, Alaska goes through White Pass, which was another major route during the gold rush. The railway over the less-steep White Pass made Chilkoot Trail obsolete in 1899.
This is how much "teeth" a train needs in order to carve through snow
Skagway, Alaska. After the gold rush, people's major occupation here was trapping animals. They now trap cruise ship tourists.
After watching a short movie about how to avoid or possibly fight bears, we take a short drive to Dyea, Alaska. The sunny weather at the trailhead is promising.
Beaver-created ponds. The first part of the trail is mostly swamps in a coastal rain forest. We have to keep making a lot of noise to scare the bears away.
Finnegan's Point is where we stop for the first night. It's after 10PM, and the sun has set behind the glacier across Taiya River.
All food, fuel, toiletry, and pretty much anything that smells has to go inside the bear cache, which is at least 50 meters away from the tents.
Day 2: It's still sunny and we are still in the rain forest. This means a lot of mosquito bites. The trail permit hanging from each hiker's backpack shows we are among the 50 people allowed on the trail at the same time.
Devil's Club. Not so fun to whack to find your way through.
We only had to carry around 1 litre of water at a time. Fresh streams provided a reliable source of water throughout the trail.
Getting your clothes wet could mean hypothermia at night. These finicky bridges can hold only one person at a time, but they mean there's no need to ford the river and risk getting wet.
Old baking oven in Canyon City. This city used to have thousands of people with shops and restaurants. It existed for about two years only and then was deserted.
Large boiler in Canyon City that powered the tramway. Only the richest prospectors would pay to use the trams.
Can you spot the old telegraph cable?
Stove in a shelter to warm up
Our tour guide provides warm gourmet camping food. Priceless!
The second night at Sheep Camp has rain in the forecast. Time to pitch the tent tightly to avoid getting wet.
Day 3: This is the big day when we have to take the steep slopes, and to add insult to injury, it starts raining. Leaving the camp at around 6am, we are pretty miserable.
The silver lining is that rain and fog make for good pictures. We are now exiting the rain forest and going above the tree line.
The old tram station at the top of the cliff.
The Scales: the Canadian government required gold rush prospectors to bring enough necessities to survive the brutal winter of Yukon. The supply checklist included items such as 150 pounds of bacon and 5 yards of mosquito nets. These scales were used to ensure the 1-ton supply was up to par.
This is where the Chilkoot Pass or "golden stairs" starts. The steep slope and harsh winter conditions killed many people and horses; their bones are scattered around the trail.
The Golden Stairs: this is the most difficult part of the trail, and the rain made it more dangerous. The top of the stairs is only a false summit; there are two more stretches of stairs left after that.
Snow patch and rain make for a fun slide, even if you don't intend to.
At the top of the stairs, on the US side of the trail, there's a monument to commemorate gold rush prospectors.
This is it; the summit! We are in Canada now and we get to go to the small warming hut to escape rain and gusting winds. We meet the Canadian warden who has worked there for decades. Time-wise, we're only halfway through the third day, but the psychological sense of achievement makes you forget that.
After the summit, we have a long trek through the Alpine tundra zone of the trail.
Wild flowers carpet the kilometer-high plains
After 10.5 hours we reach the aptly-named Happy Camp with the first sight of the washrooms. Probably one of the most scenic outhouse locations!
Didn't know a warm Lipton soup in a cup could feel so good!
Hanging wet gear to dry overnight
Having put warm and dry clothes on, we go to our tents for the third night of the hike.
Day 4: today is a rather short day to rest our bodies. We get the last view of "river beauties" before we exit the alpine zone.
It's called Long Lake. Isn't it?
Deep Lake. Glacial water and the sediments make interesting hues of colour.
We are now entering the Boreal forests. There are lots of evergreen trees and lakes.
The remains of a boat. After crossing the Chilkoot pass, prospectors started building boats to use lakes and rivers to carry their 1-ton supplies. Needless to say, they only worked for a few months of the year.
The lakes turn into rapids in canyons. Many people died in this canyon trying to haul their supplies on amateur-built boats.
We meet a porcupine (far away on the trail). This is not bad considering we aren't likely to see any large animals; they're usually scared away by big groups of people.
We finally arrive at Lindeman Lake.
The sandy beach and the turquoise colour reminds you of Caribbean climate. The temperature disagrees.
This shelter is at the former location of Lindeman City at the shore of Lindeman Lake. It was a huge tent city to serve gold rush prospectors, and at its peak, its population reached more than 10,000 people.
The Lindeman Camp had been closed a couple of weeks before our arrival because of a bear attack. Luckily only property was damaged, but the two bears involved were shot dead.
An old First Nations cemetery.
Day 5: Our last day is another 12 km from Lindeman Lake to Bennett Lake. Our trek in the Boreal forest goes along many lakes. Pictured is Bare Loon Lake. We can hear loons around the lake, but I cannot see any.
According to our tour guide, this little lake is ideal for a family of elk to call home.
We don't see any elk, but the droppings are a telltale sign of their presence.
A small trapper's hut. The roof is adorned with bones of humans and animals.
If the trail wasn't diverse enough, we now have to cross sand fields. It looks like a desert.
Lo and behold: here's Bennett Lake. Bennett itself is still meters away, but it doesn't mean you can't take a triumphant picture.
Distances between campsites and landmarks if you start from the Canadian side.
This church is the only standing building from the gold rush times. It's only a shell so no one can enter.
Bennett Train Station. This is part of the old White Pass railway. Trains still operate here, but only for tourists. Since Bennett has no road access, this is a common way for hikers to get back to civilization.
A White Pass railway worker carrying supplies around. He doesn't know how old that car is, but it still runs fine.
To end the hike on a high note, we're skipping the train and instead taking a float plane.
The pilot gives us the usual safety lecture. "The red lever opens the door, but please don't do it during the flight"
The plane takes off pretty close to the trees.
A river snaking along
The river has changed course many times and has created such beautiful shapes
View of suburban Whitehorse along the Alaska Highway. This highway was built during World War II to connect Alaska to the rest of the US.
Finally, downtown Whitehorse is in sight. The plane gets ready to land on a nearby lake.
The hike concludes with receiving a certificate. It shows an old picture of prospectors taking the golden stairs.

Slideshow plugin by Pixedelic.

If you liked this, you may also like my Iceland Trip post.

JavaScript Online: hosting a static site cheaply and effortlessly

May 12, 2016

A friend of mine was trying to get hired as a software developer, so he asked me about resources to hone his skills and practice for programming interviews. I came across a few websites where you could solve programming puzzles online. Your code would be sent to a server to run in a sandboxed container, and you would shortly get the result. My friend was specifically learning JavaScript. That made me wonder how cheaply and effortlessly I can create and maintain a similar site for JavaScript only.

The most expensive parts of hosting are usually tied to the computing power of servers. No server can beat the increasingly cheap option of not having a server at all. If your website can work with static resources, you can store everything you need on a storage service, configure your domain, spend a few cents a month, and never have to break a sweat about how your website is running. But how much can you accomplish with a static website?

In the case of an online JavaScript practice service, you can get away with not having a server for many things:

  • Core: The core of your service, running users’ code, can be delegated to their browsers. For this, web workers are a great feature that cleanly isolate potentially harmful code, and are supported in modern browsers.
  • Assets: All assets on landing pages, blog posts, and other informational pages are static.
  • Security: Users can see how you are running their code, see your test data for programming problems, and reverse-engineer them. In this case, let’s agree, that actually serves the purpose of teaching JavaScript.
  • Personalization: Local Storage can be used to store each user’s problem-solving history. This doesn’t survive a browser data deletion or moving to another device, but oh well!
  • Community and Engagement: I haven’t added any feature of such nature yet, but there are free or cheap services like Disqus or SumoMe that can add comments or email collection widgets to your static page. I’m not aware of any service for a leaderboard, but I’m sure if the site becomes popular enough, I can roll my own AWS Lambda script to take care of that.

In order to create the site, I’m using a Node.js script. The Jade templating engine, Markdown, UglifyJS, and Express.js (for a local test server) have come in very handy. I’ve written an automated build script where I only need to add a single JSON file for every problem; it creates the whole page, adds it to the index and sitemap, and deploys new or updated files to Google Cloud Storage. Google Cloud Storage makes it easy to host a static website, but it doesn’t support HTTPS yet. I’m using Cloudflare’s free plan to add HTTPS, server-side analytics, a caching layer, denial-of-service protection, and even a widget to show warnings if a browser is not supported.
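The one-JSON-file-per-problem idea boils down to something like this (a simplified sketch; the real build script also runs Jade, Markdown, and Uglify, and the field names here — slug, title, prompt — are made up for illustration):

```javascript
// Turn one problem descriptor (loaded from its JSON file) into a page body.
function buildProblemPage(problem) {
  return [
    '<h1>' + problem.title + '</h1>',
    '<p>' + problem.prompt + '</p>',
    '<textarea id="solution"></textarea>'
  ].join('\n');
}

// Regenerate the sitemap from the full list of problems on every build.
function buildSitemap(problems, baseUrl) {
  return problems
    .map(function (p) { return '<url><loc>' + baseUrl + '/' + p.slug + '</loc></url>'; })
    .join('\n');
}
```

Adding a new problem is then just dropping in a JSON file and re-running the script; the page, index, and sitemap stay in sync automatically.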

I might open-source this in the future, but for now, feel free to practice JavaScript online for fun or technical interviews!

Just How Bad Is The Airbnb Effect in Vancouver?

February 18, 2016

The price of real estate in Vancouver has been rising for years, and if you’re not the gambler type, renting has never been a more financially sensible option. However, according to a report by Canada Mortgage and Housing Corporation in October 2015 (PDF), the vacancy rate in the City of Vancouver is at 0.6%, with some neighbourhoods as low as 0.3% (English Bay).

If you are a renter like myself and you’ve decided to stay put for a while, it doesn’t affect you in the short term (British Columbia caps rent increases at less than 3% per year). But tough luck if you want to move for any reason! I’ve heard anecdotes of reasonable Craigslist listings being snatched up in a matter of hours, and what’s left are flawed units or extortionately expensive ones.

Another rumour is about people who buy properties in popular locations and, instead of putting them on the rental market, effectively run a hotel business on Airbnb without all the pesky licenses and regulations. As much as I love using Airbnb for private rooms in cities I visit abroad, the NIMBY in me is not very happy about this trend. That’s why I decided to crunch the numbers and get a better idea of just how bad the Airbnb effect is here.

Luckily, there’s a website that continuously scrapes Airbnb listings and publishes the data. I grabbed the Vancouver data from December 2015. It contains all the listings that were available in 2015. For the purpose of this analysis, I only looked at private homes (no shared or rooms), and only those with at least 90 days of availability to filter out occasional travellers that rent out their principal residence.

The Inside Airbnb data comes with the neighbourhood of each unit, but it doesn’t exactly match the official neighbourhoods posted on the City website (e.g. Gastown is in Strathcona in one and in Downtown Eastside in the other). While the city has over 20 neighbourhoods officially, the CMHC rental report is broken down into 10 zones only. For the sake of visualization, I had to roughly map neighbourhoods to zones, but it shouldn’t affect the results much.

The following map shows Vancouver neighbourhoods with information about their rental situation (click on each to see). I calculated the estimated number of vacant units by multiplying the total number of rental units in each zone by the reported vacancy rate. The map is colour-coded by how high the number of Airbnb listings are relative to the number of vacant units (darker red for a higher ratio of Airbnb listings over vacant rentals).
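The per-zone arithmetic behind the colour coding is simple (the numbers in the comment are illustrative, not a specific zone’s figures):

```javascript
// Estimated vacant units = total rental units in the zone * vacancy rate.
// The map colour is driven by the ratio of Airbnb listings to vacant units.
function airbnbPressure(zone) {
  var vacantUnits = zone.totalRentalUnits * zone.vacancyRate;
  return zone.airbnbListings / vacantUnits;
}

// e.g. 10,000 rental units at 0.6% vacancy leaves ~60 vacant units;
// 612 qualifying Airbnb listings would give a ratio of about 10.2.
```

A ratio above 1 means Airbnb holds more whole homes than the entire long-term vacant stock of that zone.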

It seems that central areas and those closer to transit have it worse. Downtown has 10.2 times as many Airbnb listings as it has vacant units for long-term rental, and the ratio is as high as 15.8 in Mount Pleasant and Renfrew Heights. It’s not easy to predict what releasing all those Airbnb listings into the rental market would do. Maybe more people would move to Vancouver and the rents wouldn’t budge much. Maybe more people would move in and rents would go lower too, but at the cost of lowering the overall income tax. I leave that analysis to people with more expertise, but it’s obvious that better regulation of those underground hotel businesses would make life slightly easier for locals who can’t or don’t want to buy into the real estate insanity.

I'm Open-Sourcing My Node.js App

February 06, 2016

After over one year of working on my side project and trying to monetize it, I’ve decided to call it quits and share my journey and the source code with the community (Github). In this post, I’m going to explain the whole development process, describe my failed attempts, and get into more technical details in case someone wants to use the app as a boilerplate to create their own.

The Idea

I was itching to create my own software-as-a-service web application while working full time. After all, the idea of having a few sources of passive income is frequently sold to software developers, and I wanted to take part in the “micropreneurship” revolution. I also wanted to practice time management and prove to myself that I can build something from scratch, and possibly have an extra source of income. There’s no need to mention the rewarding sense of building things on top of all that.

The idea of an automated virtual phone system is neither original nor sexy. In fact, there are many companies out there that are doing the exact same thing (e.g. Grasshopper, eVoice, MightyCall, etc.). This meant I might be able to take over a very small part of an established market. I knew Twilio provided an easy-to-use API, so I just had to do the math and see how high the barrier to entry is.

Research and Design

I researched the feature set of the top players, gathered a list of features that I thought I could implement in a few months, and prioritized them. I then used KickoffLabs to create a fake landing page advertising all those features. The next thing I needed to do was buy some paid traffic and measure what percentage of visitors were willing to press the “Get Started” button. Even though pressing that button doesn’t signal an intention to pay, it indicates the pitch is compelling enough for them to spare a few seconds of their time.

When I got a good conversion rate on the fake landing page (and collected a few email addresses of prospects), and compared the cost of acquiring new customers to their potential lifetime value, I started the design phase. At this stage, I carefully listed all possible features from competitors, and added some of what I thought a potential customer would need based on pure speculation. Then, I sorted them by how important they are and how much time each one of them would take me. The outcome of this stage was a high-level list of features that my minimum viable product (MVP) needed.

The next stage was technical design. I decided to use Express.js framework on Node.js, and use Twilio and Stripe as external service providers for telecom and payment processing respectively. The free tier of Amazon Web Services provided enough computing power to get a staging server up and running. I tried not to use anything too close to the cutting edge, and compromised with more established libraries like jQuery, but I leave the details for the “Under The Hood” section of this post. The technical design was based on an MVC architecture, and I only needed pen and paper to document the database design and screen mockups.

The Interesting Part: Coding!

As many other fellow software engineers can agree, coding is the most interesting part of the process. It’s certainly not the most important part though. Having been coding for 15 years, I found this part well inside my comfort zone. The real challenge was time management and staying focused. Working full time and the human need to be offline leave too little time for extra screen time. However, what really helped me was the great power of habits. I spent an hour or two every single day. No exceptions. If I was very tired or feeling down, I would work on the parts that didn’t need much attention, but the trick was to keep up the streak.

I used Trello to break down the tasks and keep track of my progress. As expected, I had to deal with numerous unexpected issues. Any line of code that you write or that comes with external libraries, and any API call to a service provider, is a potential headache. I took pleasure in solving them one at a time, and I think the source code in its current state contains a wealth of solved problems that I’ll certainly use in my future projects.

I asked a few friends to beta-test the app for me. I applied their feedback and fixed a few bugs. The development took 3-4 months, and I was ready to launch the product after all. I used a free Bootstrap theme to design the landing page. The only money spent so far was about $40 to get the logo designed and register the domain.

Going to The Market

To an engineering-minded person without any serious business experience, this part is the most daunting. To a business-savvy person, this is probably the most important part. Even though engineers like to make fun of all those “I have a great idea and will give 5% equity to an engineer to implement it” Craigslist ads, a person that can execute well on the idea is rarer than a 10X unicorn engineer.

So I got started by creating a couple of social media accounts and buying some paid traffic from the Google search network. The way my app was designed required users to enter their credit card number before buying a virtual phone number. I was too afraid of abuse and didn’t want to give away too many freebies to acquire new customers. The first campaign showed that I needed to be more open: a good number of people signed up, but all of them stopped at the payment screen.

I added a couple of free trial options. I also commissioned a friend to create a low-budget explainer video. I also changed the on-boarding workflow so users can see the control panel before entering their credit card number. These changes allowed more people to get into later stages of the sales funnel. However, none of the users that bought a phone number started paying for it after the trial period.

I guess I didn’t have enough motivation to try inbound marketing, or cold-calling potential customers. They were certainly outside my comfort zone. Attempts at selling the website on Flippa or finding someone to help me with sales and marketing didn’t go anywhere either. The amount of effort that it takes to get any traction for this web app seemed too high, and it eventually led me to open-source it and move on.

Under The Hood: Source Code

I used Express.js on Node.js for the back end, and Bootstrap and Knockout.js for the front end. Twilio was the obvious choice for the telecom API, and I hooked up my app to Stripe for payment processing. I deferred writing tests until I had paying customers, but the software was designed to be robust enough to evolve into a solid application once it was proven in the market. I used exact versioning in my NPM manifest file and included all the packages in the Git repository to remove any incompatibility or NPM infrastructure risks.
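Exact versioning just means no semver ranges in package.json — pinned versions rather than `^` or `~` prefixes. A fragment like this (the version numbers here are illustrative, not the repository’s actual pins):

```json
{
  "dependencies": {
    "express": "4.13.3",
    "sequelize": "3.14.2",
    "q": "1.4.1"
  }
}
```

Combined with vendoring the packages into Git, this guarantees every checkout builds against exactly the code that was tested.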

The architecture on the back end is MVC, and the front end uses AJAX calls to populate the page. The server merely sends a Knockout-enabled page to the client, which in turn calls an API command or two to get the required data in JSON format. With this design, I avoided the headaches that managing a single-page application brings, and got a page in front of users as soon as possible. I think users like to see “something” (the loading indicator) very quickly, but show more patience for data loading after that.

There are many opportunities to refactor the code (especially in front-end views), and there are certainly bugs to be found. However, here’s a list of different libraries and components that I used or wrote:

  • Express.js: Express is a great web framework, and the rich ecosystem of middleware allows you to shop around for best solutions to common needs.
  • Passport: A useful library for user authentication. I had to write my own session store that uses MySql (session_store.js) and use bcrypt to hash passwords, but Passport pretty much provided the rest.
  • Sequelize: It’s a good ORM that you can use for data modelling. Express doesn’t come with an ORM, so I had to pick between this and Bookshelf. I finally went with Sequelize, and it worked just fine for me. Getting entity relationships right was a bit tricky though.
  • Q: This promise/async library helped a lot with avoiding the infamous Node.js callback hell.
  • BigNumber.js: When it comes to money calculations, you shouldn’t trust JavaScript’s floats at all. To avoid them, I used BigNumber. It’s a bit verbose for simple math, but the reliability makes it worthwhile.
  • Connect-assets: This is an older asset-packaging library. It minifies and aggregates JavaScript files, and compiles stylesheet files. That’s good enough for smaller web apps.
  • Moment: The absolute best when it comes to manipulating dates. Its timezone library made working with timezones a breeze.
  • Jade: I used this for my views. It removes the HTML clutter.
  • NodeMailer: I wrote utility files to compile Jade files, and send emails via Amazon’s mail service.
  • Twilio: If you’re using Twilio, their Node.js library is a no-brainer.
  • Stripe: Not having to deal with storing credit cards is great. They provide an easy API, and all you need to store and work with is tokens instead of sensitive user information.
  • secret_sauce.js: This is a “secret” API call that I thought might be useful for invoking certain scheduled tasks, for instance cleaning up expired sessions, or charging customers. I never got to the stage that required automating them though.
  • Bootstrap: Some people may be allergic to Bootstrap at this time, but it gives you a generic well-designed-looking page for cheap.
  • Knockout.js: This is great for front-end data binding. If all you have to do is deal with a few forms on your page as opposed to a full-on single-page app, Knockout.js is your friend!
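The float pitfall that BigNumber guards against is easy to demonstrate; a common alternative for simple billing arithmetic is to work in integer cents (this sketch is my illustration, not how Phonjour’s billing code was actually written):

```javascript
// Classic IEEE-754 surprise: 0.1 + 0.2 is not 0.3 in JavaScript.
var naive = 0.1 + 0.2;        // 0.30000000000000004

// Working in integer cents is exact for addition and subtraction
// (integers are safe below 2^53); a decimal library like BigNumber
// is still the safer choice for per-minute rates and division.
function addMoney(aCents, bCents) {
  return aCents + bCents;
}

var total = addMoney(10, 20); // 30 cents, exactly
```

Either way, the rule stands: never store or compare currency amounts as binary floats.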


Creating Phonjour was a great learning experience for me. I don’t regret putting effort into it, and I hope the journey and the code scraps can be useful to others as well. In my future attempts at a side project, I’ll try to find a business-savvy person to take care of the business side.