AWS

Simply: AWS DynamoDB

My “Simply AWS” series is aimed at absolute beginners who want to get started quickly in the wonderful world of AWS. This one is about DynamoDB.

What is DynamoDB?

DynamoDB is a NoSQL, fully managed, key-value database within AWS.

Why should I use it?

When it comes to data storage, selecting which technology to use is always a big decision. DynamoDB, like any other technology, is not a silver bullet, but it does offer a lot of positives if you need a document-based key-value store.

Benefits of DynamoDB

How do I start?

First you’re gonna need an AWS account, follow this guide.

Setting up our first DynamoDB database

If you feel like it, you can set your region in the top right corner of the AWS console. It should default to us-east-1, but you can select something closer to you; read more about regions here.

From the AWS console, head to Services and search for DynamoDB, select the first option.

The first time you open up DynamoDB you should see a blue button with the text Create Table, click it.

Now you’re presented with some options for creating your table. Enter myFirstTable (this can be anything) as the Table name.

Primary key

A key in a database is something used to identify items in a table, and as such it must always be unique for every item. In DynamoDB the key is built up of a Partition key and an optional Sort key.

  • Partition key: As the tooltip in the AWS console describes, the Partition key is used to partition data across hosts. As a best practice you should therefore use an attribute that has a wide range of values. We don’t need to worry much about this for now; the main thing to take away is that if the Partition key is used alone, it must be unique.
  • Sort key: If the optional Sort key is included, the Partition key does not have to be unique (but the combination of Partition key and Sort key does). It allows us to sort items within a partition.

Let’s continue. For this example I’m gonna say I’m creating something like a library system, so I’ll put Author as the Partition key and BookTitle as the Sort key.
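To make those rules concrete, here is a small sketch with made-up items: several books can share the Author partition key as long as each BookTitle sort key differs, so every composite key stays unique.

```javascript
// Hypothetical items for a table keyed on Author (partition) + BookTitle (sort).
const items = [
    { Author: 'Douglas Adams', BookTitle: "The Hitchhiker's Guide to the Galaxy" },
    { Author: 'Douglas Adams', BookTitle: 'The Restaurant at the End of the Universe' },
    { Author: 'Ursula K. Le Guin', BookTitle: 'The Dispossessed' }
];

// The combination of partition key and sort key must be unique per item.
const compositeKeys = items.map(item => `${item.Author}|${item.BookTitle}`);
const allUnique = new Set(compositeKeys).size === items.length;
console.log(allUnique); // true
```

Note that two of the items share a partition key; only the composite of both key attributes has to be unique.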

Note that this is just one of many ways you could set up this type of table, and choosing a good primary key is arguably one of the most important decisions when creating a DynamoDB table. What’s good about AWS is that we can create a table, try it out, change our minds and simply create a new one with ease.

Next up are table settings: things like secondary indexes, provisioned capacity, autoscaling, encryption and such. It’s a good idea to eventually get comfortable with these options, and I highly recommend looking into the on-demand read/write capacity mode, but as we just want to get going the default settings are fine and will not cost you anything for what we are doing today.

Hit create and wait for your table to be created!

Now you should be taken to the Tables view of DynamoDB with your newly created table selected. This can be a bit daunting, as there are a lot of options and information, but let’s head over to the Items tab.

From here we could create an item directly from the console (feel free to try it out if you want), but I think we can do one better and set up a lambda for interacting with the table.

Creating our first item

If you’ve never created an AWS lambda before I have written a similar guide to this one on the topic, you can find it here.

Create a lambda called DynamoDBInteracter.

Make sure to select creating a new role from a template, and search for the template role Simple microservice permissions (this will allow us to perform any actions against DynamoDB).

After creating the lambda we can edit it directly in the AWS console; copy and paste this code.

const AWS = require('aws-sdk');
const client = new AWS.DynamoDB.DocumentClient();

exports.handler = async (event) => {
    try {
        console.log(event.action);
        console.log(event.options);

        switch (event.action) {
            case 'put':
                return await putItem(event.options);
            case 'get':
                return await getItem(event.options);
            default:
                return `Unknown action: ${event.action}`;
        }
    } catch (e) {
        console.error(e);
        return e.message || e;
    }
};

// Write an item to the table; the primary key attributes are required.
const putItem = async (options) => {
    const params = {
        TableName: 'myFirstTable',
        Item: {
            Author: options.author,
            BookTitle: options.bookTitle,
            genre: options.genre
        }
    };
    return client.put(params).promise();
};

// Read a single item by its full primary key (partition + sort key).
const getItem = async (options) => {
    const params = {
        TableName: 'myFirstTable',
        Key: {
            Author: options.author,
            BookTitle: options.bookTitle
        }
    };
    return client.get(params).promise();
};

Hit Save, then create a new test event like this:

{
    "action": "put",
    "options": {
        "author": "Douglas Adams",
        "bookTitle": "The Hitchhiker's Guide to the Galaxy",
        "genre": "Sci-fi"
    }
}

and run that test event.

Go back to DynamoDB and the Items tab and you should see your newly created Item!

Notice that we did not have to specify the genre attribute. That is because DynamoDB is NoSQL: it follows no schema, and any field and value can be added to any item regardless of the other items’ composition, as long as the primary key is valid.
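A quick sketch of what that means in practice (the second book and its extra attribute are made up for illustration):

```javascript
// Two valid items for the same table: only the primary key attributes
// (Author + BookTitle) are required, everything else is optional.
const itemWithGenre = {
    Author: 'Douglas Adams',
    BookTitle: "The Hitchhiker's Guide to the Galaxy",
    genre: 'Sci-fi'
};
const itemWithoutGenre = {
    Author: 'Frank Herbert',
    BookTitle: 'Dune',
    pageCount: 412 // a completely different extra attribute is fine too
};

// The only thing DynamoDB enforces here is a valid primary key.
const hasValidKey = item => Boolean(item.Author && item.BookTitle);
console.log(hasValidKey(itemWithGenre), hasValidKey(itemWithoutGenre)); // true true
```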

Retrieving our item

Now let’s try to get that item, create another test event like this.

{
    "action": "get",
    "options": {
         "author": "Douglas Adams",
        "bookTitle": "The Hitchhiker's Guide to the Galaxy"
    }
}

and run it.

You can expand the execution results and you should see your response with the full data of the item.

Congratulations!

You’ve just configured your first DynamoDB table and performed calls against it with a lambda. But don’t stop here; the possibilities with just these two AWS services are endless. My next guide in this series will cover API Gateway and how we can connect an API to our lambda, which then communicates with our database table. Stay tuned!

What’s next?

As I’m sure you understand, we’ve just begun to scratch the surface of what DynamoDB has to offer. My goal with this series of guides is to get your foot in the door with some AWS services and show that, although powerful and vast, they are still easy to get started with. To see more of the calls that can be made with the DynamoDB API (such as more complicated queries, updates and scans as well as batch writes and reads), check this out, and feel free to edit the code and play around.
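As a small taste of those richer calls, here’s a sketch of the params for a DocumentClient query that would fetch every book by one author from our example table (the call itself would be client.query(params).promise(), mirroring the lambda above):

```javascript
// Sketch: query every item sharing one partition key (all books by an author).
// You would pass these params to client.query(params).promise() on a
// DynamoDB DocumentClient, as in the lambda earlier in this guide.
const params = {
    TableName: 'myFirstTable',
    KeyConditionExpression: '#author = :author',
    ExpressionAttributeNames: { '#author': 'Author' },
    ExpressionAttributeValues: { ':author': 'Douglas Adams' }
};
```

The placeholder names (#author, :author) are how the Query API separates attribute names and values from the expression itself.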

I would also like to recommend this guide if you want an even more in-depth look at DynamoDB; it covers most of what I have here but in more detail, and it also goes over some of the API actions mentioned above.

Contact me

Questions? Thoughts? Feedback?
Twitter: @tqfipe
Linkedin: Filip Pettersson

AWS

AWS Summit Online 2020

June 17, 2020 – You are well on your way to the best day of the year for cloud! Join the AWS Summit Online, a free virtual event for technologists at any level, and deepen your knowledge. There is something for everybody.

Hear about the latest trends, customers and partners in EMEA, followed by the opening keynote with Werner Vogels, CTO of Amazon.com. All developers at TIQQE always attend Werner’s keynotes.

After the keynote, dive deep in 55 breakout sessions across 11 tracks, including getting started, building advanced architectures, app development, DevOps and more. Tune in live to network with fellow technologists, have your questions answered in real-time by AWS Experts and claim your certificate of attendance.

So, whether you are just getting started on the cloud or are an advanced user, come and learn something new at the AWS Summit Online.

Want to get started with AWS? At TIQQE, we have loads of experience and are an Advanced Partner to AWS. Contact us, we’re here to help.

What to expect

Serverless

Simply: AWS Lambda

Why should I use AWS Lambda and how does it work? In this blog post I provide you with a practical hands-on guide of how to create your first AWS Lambda service and explain why you should use it to create awesome customer value.

What is AWS Lambda?

With AWS Lambda we can write code and execute it without worrying about configuring servers.

Why should I use it?

It enables you to quickly develop business relevant code and deliver value for your customers and stakeholders.

How do I start?

First you’re gonna need an AWS account, follow this guide.

Creating our first Lambda

From the AWS console, head to Services, search for Lambda and select the first option.

Click Create Function

Enter a name for your lambda and select a runtime (I’m going with Node.js). Leave everything else default.

Writing code

When your lambda is created you’ll be taken to that lambda’s page, where you can see and set up lots of information and options about your lambda. Let’s not worry too much about that right now and just scroll down to “Function Code”.

Let’s enter some code using the inline editor (you are of course able to write code in any IDE you want and deploy it to AWS, but I’ll cover that in another post). This is what I used.

// Add the input to the animals array.
// Note: we return a new array (instead of reassigning the parameter)
// so that array inputs work as well as strings.
const concatAnimalsInput = (animals, input) => {
    if (typeof input === 'string') {
        return [...animals, input];
    }
    return animals.concat(input);
};

exports.handler = async (event) => {
    console.log('event', event);
    // initiate animals array
    const animals = ['cat', 'dog', 'tardigrade'];
    // get input from the request body
    const input = JSON.parse(event.body).input;
    // concatenate animals with input
    const result = concatAnimalsInput(animals, input);
    // create a response object and return it
    return {
        statusCode: 200,
        body: JSON.stringify(result),
    };
};

Testing our code

At the top of the screen click configure test event and create an event to execute the function with.

The event in JSON format
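For reference, a test event matching the code above could look like this sketch (the animal name is just an example). Note that body is a JSON string, the same shape API Gateway would deliver:

```javascript
// A sample test event for the lambda above.
// event.body is a JSON *string*, not an object.
const testEvent = {
    body: JSON.stringify({ input: 'axolotl' })
};
console.log(JSON.parse(testEvent.body).input); // 'axolotl'
```

In the console's test event editor you would paste just the JSON itself, i.e. {"body": "{\"input\": \"axolotl\"}"}.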

Hit Create and finally click the “Test” button.

After its execution you’ll see the result and the output by clicking Details in the green result box. You can also click (logs) to enter CloudWatch Logs and get a better look into all executions of your lambda.

Good job!

You’ve just created a lambda, and the possibilities with it are endless. In future posts I’ll discuss how we can connect an API to our lambda via API Gateway, and how we can store our data in the NoSQL database DynamoDB.

Discussion: what about the price?

With Lambda, the first million requests each month are always free; after that you pay $0.20 per 1M requests and $0.0000166667 for every GB-second, read more here. Lambda is usually used together with other AWS services that might also incur costs, such as CloudWatch Logs, which we touched upon in this post. CloudWatch Logs also offers a free tier of 5GB of log data ingestion and 5GB of log data archive, which means nothing we did in this post will result in any cost, even if you do no cleanup.
Read more about the economics of cloud here: “Cloud is expensive”
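To get a feel for those numbers, here’s a back-of-the-envelope sketch (the workload figures are invented for illustration, and Lambda’s compute free tier is ignored to keep it simple):

```javascript
// Rough monthly Lambda cost estimate based on the prices above.
// Assumed example workload: 3M requests/month, 128MB memory, 100ms average duration.
const requests = 3_000_000;
const memoryGb = 128 / 1024;   // 0.125 GB
const avgDurationSec = 0.1;

// First 1M requests per month are free, then $0.20 per 1M.
const billableRequests = Math.max(0, requests - 1_000_000);
const requestCost = (billableRequests / 1_000_000) * 0.20;

// Compute is billed per GB-second at $0.0000166667
// (the compute free tier is left out of this sketch).
const gbSeconds = requests * avgDurationSec * memoryGb;
const computeCost = gbSeconds * 0.0000166667;

console.log(requestCost.toFixed(2));                 // '0.40'
console.log((requestCost + computeCost).toFixed(2)); // '1.03'
```

Even at three million requests a month, this example workload costs about a dollar.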

I don’t want to use the inline code editor!

Great, me neither! As a first step I suggest either looking into exporting your code to a zip file and uploading it to the lambda, or exploring the Serverless Framework, a tool that makes it easy to deploy serverless applications such as lambdas!

You’re welcome to contact me if you have any questions.

Mail: filip.pettersson@tiqqe.com
LinkedIn: Filip Pettersson

Read more articles in this series:

Simply: AWS DynamoDB

#theTIQQEcode

It’s supposed to be scary

Imagine going to work every day and doing the same task over and over. I reckon you would get pretty good at that task, I reckon you would feel confident doing it, and if somebody asked you to perform it you would feel no anxiety in doing so. I also reckon that you wouldn’t grow much, you would definitely lose out on the adrenaline rush of doing something you haven’t attempted before, and you certainly would not have as many interesting stories to tell at the next party you attend. Now if you think that is fine, if you’re content being in a bubble of affirmation, that’s ok, but spoiler alert… you’re missing out.

We should always challenge ourselves and others to step out of our comfort zones. It can be in simple tasks, like talking to the new co-worker you’ve seen around the office, or something bigger, like making the effort to hold a knowledge-sharing lunch about a topic you’re passionate about, or even something colossal, like taking the leap of faith to switch to serverless while everybody else is debating whether we really should use public cloud or not.

Regardless of the scale there are two things I think we should keep with us:

  1. In the process of doing something new we are evolving
  2. It is okay to fail

The first one is to motivate us through the process, by reminding us that despite the difficulties and despite the overwhelming feelings, we are becoming better by going through it. Even if, in a worst case scenario, you don’t develop any new skills and don’t learn anything, you will still have built character by stepping out of normality; there’s just no lose case.

The second one is to make us build up courage to do something in the first place and to keep that courage after something we tried didn’t go quite as planned, because failing is a part of the process.


I must admit that as I’m writing this I’m stepping out of my comfort zone. This is the first blog post I’ve ever really written, and it is scary; the thoughts running through my head are endless: “will anybody even read this”, “who cares about this stuff”, “what if everybody hates it”. Regardless, I felt that I had to do it (with some motivation from my colleagues and friends), and in the end I feel proud. I’ve gone from an idea to a three-paragraph text about something that interests me, and in the process I’ve gotten to practice a bunch of skills. Honestly, even if no one reads this, I’m still happy that I took the time to sit down and put these words together. I hope that the next time an opportunity arises for you to step outside of your comfort zone, you think back to this post and give it a shot. I mean, the worst that can happen is that you evolve.


If something is hard or scary, do it often & improve the process until it’s not hard or scary any more.

Serverless

Re:invent comes to you

TIQQE is hosting re:Invent comes to you – 2019! Welcome to our Örebro office on the 5th of December.

We have been working with Cloud native technologies since 2012 and we’ve learned a whole lot during that journey. We want to share lessons learned and also explain why we are all in on serverless!

re:Invent is a yearly event hosted by AWS in Las Vegas; it spans six days and touches on all parts of AWS.

We will treat you to a great evening in Örebro with awesome presentations of the latest and greatest within serverless technology! You will be able to enjoy food and drinks while making new connections with lots of interesting people!

Register today right here

Serverless

AWS Serverless Meetup in Örebro

Yesterday TIQQE co-hosted the AWS user group meetup Örebro together with Headlight. We did the Wild Rydes workshop and it was a great time; we managed to complete the entire workshop with some time left for discussion and reflection. Thanks to everyone who attended and made it an awesome evening. We look forward to hosting and attending more AWS-oriented meetups in Örebro in the future!

AWS Serverless Meetup Örebro 2019
Customers

Re:invent comes to you

Not everyone is able to join AWS re:Invent in Las Vegas, so we decided to take re:Invent to Örebro and invite everyone interested in AWS and TIQQE.

AWS re:Invent 2018 took place November 25 to November 30 in Las Vegas, Nevada. As not everyone is able to join, we decided to host our own event in Örebro. We attracted around 40 visitors to our first event, which is quite amazing and shows a big interest in cloud and AWS in Örebro.

David Borgenvik, founder of Tiqqe, started up the evening

We kicked off with some food, drinks and mingle to get everyone in the right mood for the evening. As AWS broadcasts many of the sessions live, we watched Werner Vogels’ keynote, which is always a highlight of re:Invent. We had prepared a number of sessions ourselves to inspire people with how we use AWS and serverless technology in real-life projects.

The evening was a success and there were a lot of interesting discussions. We will definitely continue with the “Re:invent comes to you” concept next year and hope for even more participants.