We are proud to welcome Svenska Retursystem as a new customer to TIQQE and to be selected as their Integration partner moving forward. Svenska Retursystem is the smart cycle for our food. ‘Our background is a belief that the entire grocery industry would benefit from a common system for carriers, where responsibility would be administered and handled by a common party’.
Our system is based on standardized load carriers and standardized services that provide efficient and sustainable product flows for all players throughout the chain, from the producer to the store or restaurant. Through cooperation and reuse, we reduce the industry’s environmental impact. Our driving force is to create efficient processes at all levels, and we therefore constantly work to develop simple, efficient and sustainable solutions for the industry.
This creates a high demand for internal and external integrations.
Svenska Retursystem were looking for a partner who could support them along their journey with a solution that meets their needs.
TIQQE’s Serverless Integration Service is built on AWS and provides customers with a scalable and price-efficient solution.
Welcome to TIQQE, we’re proud and honored to be part of your journey and looking forward to a long-term partnership.
Harvey Diaz, the newest architect to join TIQQE, has a solid 15+ years of software development experience, providing solutions for various business verticals – ranging from digital ad service providers, to enterprise resource management vendors, to platform integration specialists – earning him a mark as a well-rounded software architect. The last 7 years have been more focused on the e-commerce industry and its integrations with shipping carriers and third-party logistics providers.
He was one of the main developers of e-commerce software that proudly features self-integration with marketplaces, shipping and warehouse management, and which automates order syncing, label printing, bin pickup allocations and AI-assisted on-demand packing and pack railways, among others.
He was previously a volunteer trainer for free and open source software in Southeast Asia as part of the International Open Source Network, a United Nations Development Programme regional capacity building initiative. In his free time, besides learning new technologies, he engages in community service and volunteerism, trying his best to make a positive change in society. He is currently the president of a local chapter of the Junior Chamber International in Cebu City.
In November last year I was contacted by Mathias and Oskar, who are attending the last year of the Information Technology program at Örebro University. They wondered if they could do their SUP (System Development Project) at TIQQE during this semester, and now they will finally present what they came up with!
Who we are and what is the SUP?
We are two students from the Information technology program at Örebro University, who have studied together since the beginning of our studies. We are currently in our last semester. Our last semester involves carrying out a system development project, where you either do your own project or you carry out a collaboration project with external actors (such as a company).
The point of the SUP (system development project) is to apply and develop previously acquired skills in a longer project. The project means that we plan and carry out independent work where the focus is problem solving and application. This involves all parts of the system development, from analysis to implementation.
How did we end up at TIQQE?
For us, there was no doubt that we wanted to do the SUP at a company. So last year in November, we started thinking about what kind of company would suit both of us. It was important to find a company that could offer a challenging project. But it was at least as important that the company’s culture and values were in line with our own; this mattered because we wanted the best opportunities for personal and professional growth. We wanted to be in a place where it feels good and where we actually contribute something. We came in contact with a TIQQE:r who told us about TIQQE, the culture, the values and what technologies they use. This caught our interest, so we contacted TIQQE because we wanted to know more about them and whether they could be interested in offering us a project. We had an interview, where it turned out that they were interested in a SUP project. They were able to offer us what we were looking for, and best of all, we got the same good feeling from them as we had gotten from the TIQQE:r we spoke to earlier. So the journey started from there.
What is the project and what was the result?
TIQQE had a time reporting and salary system that was working but the current processes were time-consuming and not sustainable for the future. It involved a lot of manual work, so TIQQE wanted us to review this and see if we could automate this instead.
We started by reviewing the current time reporting and salary system to identify unnecessarily time-consuming processes, and then investigated what could be done smarter and what could be automated. After investigating, we came up with a new design, created it and implemented it.
The new salary system ended up working like this:
First we collect all the absence time entries from the past month from Clockify’s API by making GET requests. This data is then filtered so that only TIQQE employees’ time entries are left. The data is then transformed into an XML file and saved in an S3 bucket. All of this is done with AWS Lambda. The Lambda then creates a temporary link for downloading the file, valid for 7 days. The link is mailed out with SES to a group mailbox owned by TIQQE. From there, the person responsible for salaries can simply download the XML file and import it into Visma (the salary system).
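The transformation step of that Lambda can be sketched roughly like this in Python. Note that the entry field names (userEmail, start, end) and the XML element names below are illustrative assumptions, not the actual Clockify or Visma formats; the S3 upload, presigned link and SES mail are left as comments since they require real AWS credentials.

```python
# Sketch of the filter + XML transformation step of the salary Lambda.
import xml.etree.ElementTree as ET

def filter_employees(entries, domain="tiqqe.com"):
    """Keep only the time entries that belong to TIQQE employees."""
    return [e for e in entries if e["userEmail"].endswith("@" + domain)]

def entries_to_xml(entries):
    """Transform absence entries into an XML document for import."""
    root = ET.Element("AbsenceEntries")
    for e in entries:
        item = ET.SubElement(root, "Entry")
        ET.SubElement(item, "Employee").text = e["userEmail"]
        ET.SubElement(item, "Start").text = e["start"]
        ET.SubElement(item, "End").text = e["end"]
    return ET.tostring(root, encoding="unicode")

# In the handler, the XML string would then be written to S3 with
# put_object(), a 7-day link created with generate_presigned_url(),
# and the link mailed out with SES send_email().
```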
When the automation of the salary system was implemented, we started working on an analytics tool with AWS QuickSight. This is done by requesting a detailed report from Clockify’s API with a POST request from a Lambda. The data arrives in CSV format, gets transformed into JSON and is saved in an S3 bucket. Once the data is in the S3 bucket, a Crawler creates metadata for an ETL job, which converts some data types into others and saves the newly formatted data as Parquet in an S3 bucket. This data is then crawled again to create metadata about the Parquet files. With the new data in place, all we needed to do was write the SQL queries that QuickSight uses to pick out the relevant data, and then create the diagrams to display it. Now TIQQE can see things like total profit per month/year, profit per customer, and much more.
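The CSV-to-JSON step of that pipeline is straightforward with Python’s standard library. The column names in the comment are made up for illustration; the real detailed report from Clockify has its own set of columns.

```python
import csv
import io
import json

def detailed_report_to_json(csv_text):
    """Parse a CSV detailed report (e.g. columns like Project, User,
    Duration) into a list of dicts, ready to be serialized as JSON
    and saved to S3 for the Glue Crawler to pick up."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [dict(row) for row in reader]

# Each record would then be written to the bucket as one JSON line,
# e.g. "\n".join(json.dumps(r) for r in records).
```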
We have done some testing and everything seems to work, so now we are eager to put it into production.
How has it been to do the SUP at TIQQE?
It has been a great experience. Everyone at TIQQE is super nice and we have felt like part of the team since day one. They included us in meetings and activities whenever we wanted to join. We never got the feeling that we were “worth less” just because we were there doing a school project, which is great. TIQQE also provided us with a good project at the perfect difficulty level, and we got a person to give us technical advice when things got a bit tricky. When we have talked to other students, some of them didn’t get any technical specialists or advice at all, and this held them back quite a lot, both in learning and in progressing with their projects. We did not experience any of what some of our fellow students described, so this was well done by TIQQE. Lastly, we want to thank everyone involved and TIQQE for allowing us to do this project. We learned a ton of new stuff.
We’ve been very happy to have you with us, and great job with your SUP!
I knew that TIQQE worked with Amazon Web Services and that the company strived for a good company culture, which basically was what made me interested. I was looking for a job with something interesting to work with (AWS) and where I would enjoy going to work everyday and TIQQE ticked both those prerequisites off. I also knew Jacob (TIQQE’s CEO) from before, and knowing how good of a person he is, I also knew that TIQQE would be a great place to work at since he worked there!
What was your first impression of TIQQE your first week? Since we’ve been working from home the whole time it’s been a bit different compared to other jobs, but I’m also quite used to talking to people through Slack, Google Hangouts and similar services, so I didn’t think too much about it. But we started off with a couple of meetings and introductions during my first days, so I got in contact with multiple colleagues straight away and was assigned a mentor who has been very helpful.
What is your role at TIQQE? Full Stack developer, currently working with PostNord.
How has your first time been at TIQQE? It’s been great! Everyone at TIQQE has been very nice and friendly, and I’ve gotten into a great team at PostNord. It’s a lot of fun and rewarding to work with something that is used daily!
What are you looking forward to in the nearest future? I haven’t thought too much about that – since I just recently joined both TIQQE and the team at PostNord I’m just trying to get up to speed with everything and do my best!
What do you know about TIQQE now?
That it indeed is a great company, and I’m very happy that I joined! Looking forward to see what the future brings!
If you want to read why Joakim joined TIQQE follow the link here!
Edwin is an innovative web developer who manages all aspects of the development process. He’s passionate about solving problems, creating ideas, and learning new technologies. He has a lot of experience working on different technologies such as Python, PHP, Typescript, Angular, VueJS, Docker, and so much more. He is more focused on backend development, and how to automate things.
When not at work, Edwin loves biking and motorcycle riding. He also loves playing MMORPG games and spending time with his family. For Edwin, work, hobbies, and family should be well-balanced.
What is required to build a sustainable organization, where employees choose to stay and where they can develop in an innovative environment? How can we make employees feel “this is the last company I will work for”?
Last week we listened to the first part of the podcast, where we got to know Joakim and he covered: why sneakers are the best thing to wear, why a pizza oven should be used in every household, the importance of taking a nice nap, and what it was like to work on “the dark side”.
In the second part, “Culture and innovation”, we will delve into culture and innovation – the symbiosis between them and how important it is that both exist within the organization. Joakim will also share his insights and experiences on how to build a sustainable community where people choose to stay, and how to create an innovative environment where those who are part of it feel this will be the last company they will ever work for. He also shares his top 5 list of books to read.
Last week TIQQE’s Joakim Restadh was a guest at ZervicePoints Podcast.
He will share real-life experiences from both the past and the present and, most importantly, the lessons learned along the way.
In the first part we will get to know Joakim, where he shares: why sneakers are the best thing to wear, why a pizza oven should be used in every household, the importance of taking a nice nap, and what it was like to work on “the dark side”. He will also share his insights and experiences on how to build a sustainable community where people choose to stay, and how to create an innovative environment where those who are part of it feel this will be the last company they will ever work for.
The second part will be connected to “Culture and innovation”: we will deep-dive into culture and innovation – the symbiosis between them and how important it is that both exist within the organization.
Joakim also shares his top 5 list of books to read.
AWS has released a new AWS SSO application in the Azure AD application gallery to make this configuration even easier! AWS has also worked with Microsoft to update the existing Azure AD gallery application names and descriptions to help people understand the difference between the use cases.
You have three different options to connect your Azure AD to AWS:
Use the AWS Single Sign-On Azure AD gallery application for multi-account access and single sign-on to the AWS Console, AWS Command Line Interface, and AWS SSO integrated applications.
Use the previous Azure AD gallery application, now named AWS Single-Account Access.
Use the AWS Console Access app for password vault sign-in to the AWS Console in a single account.
In this blog post, I will go through option number 1 and walk you through the process step by step.
Organizations usually like to maintain a single identity across their range of applications and AWS cloud environments. Azure Active Directory (AD) is a common authentication method since Office 365 is widely used among companies, and it is often the hub of authentication as it tends to be integrated with other applications as well.
If you are using AWS Single Sign-On (SSO), you can leverage your existing identity infrastructure while enabling your users to access their AWS accounts and resources in one place. By connecting Azure AD with AWS SSO, you can sign in to multiple AWS accounts and applications using your Azure AD identity, with the possibility to enable automatic synchronization of Azure AD users/groups into AWS SSO.
This makes perfect sense and often improves your ability to further automate user lifecycle management and access to your AWS accounts, as you might already have an identity manager connected to your HR system, such as Microsoft Identity Manager or Omada. You can also leverage your existing process for applying for access to different systems; ServiceNow or a similar solution might already be connected to Azure AD in one way or another, which could then be used for applying for access to different AWS accounts. There are also other benefits, such as leveraging your existing MFA solution if your organization has one in place.
To the good stuff! In this blog post I will demonstrate how you can connect your Azure AD to AWS SSO and take advantage of its capabilities.
Creating a new Azure Enterprise Application
Log in to your Azure portal and open Azure Active Directory. Under the Manage section, click Enterprise applications.
Click New application and search for “AWS”. Select AWS Single Sign-on, give your new application an appropriate name and click Create.
Once the Azure gods have created our new application, head into the Overview page and select Set up single sign-on and choose the SAML option.
Under the SAML Signing Certificate section, click Download next to Federation Metadata XML.
Please keep this page open as we later need to upload the metadata XML file from AWS SSO.
Setup AWS SSO
Log in to the AWS Management Console and open AWS Single Sign-On, making sure that you are in your preferred region. If you haven’t enabled AWS Single Sign-On already, you can enable it by clicking Enable AWS SSO as shown below.
Click Choose your identity source. You can also configure your custom User portal URL if you’d like but it is not required.
Select External identity provider. In the IdP SAML metadata section, upload the Azure AD Federation Metadata XML file downloaded earlier, and then download the AWS SSO SAML metadata file.
In the Azure SAML page, click Upload metadata file and upload the AWS SSO SAML metadata file.
If you configured a User portal URL earlier, you need to edit the Basic SAML Configuration section so that the Sign-on URL matches it.
Setting up automatic provisioning
The connection between Azure AD and AWS SSO is now established, so we can proceed to enable automatic provisioning to synchronize users/groups from Azure AD to AWS SSO.
Note that you can use Azure AD groups, but not nested groups, i.e. groups within other groups.
Head over to the Provisioning page and change the mode to Automatic. Please keep this page open as we will copy values from AWS SSO.
In the AWS SSO Settings page, click Enable automatic provisioning.
Take note of both values given in the popup.
In the Provisioning page in the Azure portal, expand the Admin Credentials section and insert the values from above. It is also recommended to add an email address for notification of failures.
Note that these tokens expire after 1 year and should be renewed for continuous connectivity.
Click Test Connection and it should result in a success message.
Expand the Mappings section and click Synchronize Azure Active Directory Users to customappsso.
Which attributes you want to sync over depends on your setup, but for a default setup you can remove all attributes except: userName, active, displayName, emails, name.givenName and name.familyName.
Then create a new attribute mapping from objectId to externalId.
It is also worth noting that you can modify the email attribute to use userPrincipalName instead of mail, since not all users have Office 365 licenses, which leaves the mail attribute null.
In the Provisioning page, you can now set the Status to On. It is recommended to leave Scope set to Sync only assigned users and groups. Click Save; it should take about 5 minutes for synchronization to start.
Our AWS SSO and Azure AD connection is now fully set up. When you assign Azure users/groups to the enterprise app, they will appear in AWS SSO Users/Groups within around 40 minutes.
Creation and assignments of AWS SSO Permission Sets
Using Permission Sets, we can assign permissions to synchronized groups and users. These Permission Sets will later create IAM roles in the accounts to which they are assigned. You can create new Permission Sets based on AWS Identity and Access Management (IAM) managed policies or create your own custom policies.
To create a new Permission Set in the AWS Management console you can follow the below steps:
Go to the AWS SSO management portal, in the navigation pane, choose AWS accounts and then the AWS organization tab.
In AWS account, choose the account that you want to create a permission set for, and then choose Assign users.
In Display name, choose the user name that you want to create the permission set for, and then choose Next: Permission sets.
In Select permission sets, choose Create new permission set.
In Create new permission set, choose Use an existing job function policy or Create a custom permission set depending on your needs, click Next: Details, and then select a job function or create a custom or managed policy.
You can then complete the guide and click Create.
You should then see the message “We have successfully configured your AWS account. Your users can access this AWS account with the permissions you assigned”.
But be careful how you deploy these AWS SSO Permission Sets and assignments, since they need to be executed in the Master account. You should always follow the least-privilege principle and therefore carefully plan which approach you use to deploy these Permission Sets and assignments. If you want to automate the creation and assignment of Permission Sets further, I suggest an event-based approach that assigns Permission Sets using Lambdas.
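As a sketch of that event-based approach, such a Lambda would call the sso-admin API’s create_account_assignment for each incoming event. The function below takes the boto3 client as a parameter, and all ARNs and IDs are placeholders:

```python
def assign_permission_set(sso_admin, instance_arn, account_id,
                          permission_set_arn, group_id):
    """Assign an existing Permission Set to an AWS SSO group in one
    account. `sso_admin` is a boto3 client("sso-admin") instance."""
    return sso_admin.create_account_assignment(
        InstanceArn=instance_arn,           # from sso-admin list_instances
        TargetId=account_id,                # 12-digit AWS account id
        TargetType="AWS_ACCOUNT",
        PermissionSetArn=permission_set_arn,
        PrincipalType="GROUP",
        PrincipalId=group_id,               # synchronized AWS SSO group id
    )
```

A Lambda triggered by, for example, an approval event from your access request system would call this once per account/group pair, keeping all assignments auditable and repeatable.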
In this blog post I showed how you can connect Azure AD to AWS Single Sign-On (SSO). You can now manage access to AWS accounts and applications centrally for single sign-on, and make use of automatic provisioning to reduce complexity when managing and using identities. Azure AD can now act as a single source of truth for managing users, and users no longer need to manage additional identities and passwords to access their AWS accounts and applications. Sign-in is accomplished through the familiar Azure AD experience, and users can choose which accounts and roles to assume in the AWS SSO portal.
You also now have the possibility to use your existing automation processes for applying for, granting and revoking access to systems.
If you have any questions or just want to get in contact with me or any of my colleagues, I’m reachable on any of the following channels.
What tools are available if I want to start building my own serverless applications? We will go over some of the most popular frameworks and tools available for the developer who wants to get started with AWS Lambda.
There are a lot of tools out there to use when building software that is powered by AWS Lambda. These tools aim to ease the process of coding, configuring and deploying the Lambdas themselves, but also the surrounding AWS infrastructure. We will discuss the following alternatives:
AWS Cloud Development Kit (CDK)
Serverless Framework
AWS Serverless Application Model (SAM)
Terraform
About AWS CloudFormation
AWS CloudFormation is a service within AWS that lets you group resources (usually AWS resources) into stacks that you can deploy and continuously update. These CloudFormation templates are written in JSON or YAML and manually typing those can be very tedious. Especially when your stack grows to a lot of resources that reference each other in multiple ways. What lots of these frameworks and tools do is to provide an abstraction layer in front of CloudFormation so that the developer can more rapidly create and focus on the business value of the service they are building.
AWS Cloud development kit (CDK)
The AWS CDK went into general availability in the summer of 2019 and has been getting a lot of traction lately. It is an open-source framework that lets you define your infrastructure in a general-purpose programming language instead of writing CloudFormation directly. You then generate CloudFormation templates from your code by running the command cdk synthesize.
Serverless Framework
This one has been around since 2015 and was called JAWS before quickly changing to its current name. As the very descriptive name says, it’s a framework for building serverless applications! The framework is easy to get started with, and setting up an API with a few Lambdas requires very little configuration from the developer, as the framework takes care of the underlying CloudFormation template.
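Regardless of which tool you pick, the function code itself is just a plain Lambda handler; the frameworks differ in how it is configured and deployed, not in how it is written. A minimal Python sketch (the API Gateway event shape is simplified to the fields used here):

```python
import json

def handler(event, context):
    """Minimal HTTP-triggered Lambda handler. The same function could
    be wired up via serverless.yml, a SAM template.yml, or a CDK
    Function construct."""
    params = event.get("queryStringParameters") or {}
    name = params.get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello {name}"}),
    }
```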
Because of its specific focus on serverless applications, the framework is not as broad as the CDK, and that comes with pros and cons. You will get a lot of help if you are setting up Lambdas or the events that trigger them, but setting up surrounding infrastructure such as queues, tables, Kinesis streams and Cognito user pools will often require you to write pure CloudFormation. At TIQQE, some of us like to create and deploy this surrounding infrastructure with the CDK, while developing the Lambdas and the API Gateway with Serverless Framework.
Serverless Framework is open source and multi-cloud. It’s also extendable with a wide range of plugins created by the community.
AWS Serverless Application Model (SAM)
AWS SAM is another framework, similar to Serverless Framework in that it lets the developer write less code when building serverless applications. Unlike Serverless Framework, SAM is specific to AWS, and its main configuration file template.yml is written in CloudFormation. So if you have previous experience with AWS and CloudFormation, you will likely find it easy to get started with SAM. A neat feature in SAM is that it supports deploying APIs using Swagger out of the box.
Terraform
This multi-cloud tool for infrastructure as code is worth a mention! It has been around since 2014 and is written in Go. For AWS it uses the aws-sdk to manage resources instead of CloudFormation, which has the benefit of not being subject to the 200-resource limit that AWS imposes on CloudFormation templates.
How do I choose which one to pick?
It comes down to some characteristics of your application, and a fair bit of personal preference!
Are you building a service with an API endpoint, with little or no previous experience in AWS or serverless architecture? We recommend you check out Serverless Framework.
Are you not a fan of writing CloudFormation and your architecture needs a lot of infrastructure? Check out AWS CDK.
Are you familiar with CloudFormation and want to get started with serverless applications? AWS SAM could be the perfect match!
There are countless forum posts and articles debating whether to go with AWS SAM or Serverless Framework. The frameworks are very similar, and many times it comes down to personal taste. At TIQQE we have lots of experience working with Serverless Framework, and some of us would argue that you get the job done in fewer lines of code with Serverless Framework. With that said, SAM does not have to worry about being generic across multiple clouds, and that can be an edge if you are working only with AWS. SAM also defaults to giving each Lambda function least-privilege access rights (an AWS best practice), while Serverless Framework shares one role between all Lambdas.
Terraform can be a good match if you are creating infrastructure as code across multiple clouds. While Terraform is capable of many things, it is not specialised in serverless technologies, and you will have to write a lot of code to achieve the same results as the other frameworks described in this post. Not having a 200-resource limit is nice, but it should not be a problem that often if you are designing your systems as microservices.
Do you have any comments or questions about this article? Please reach out!
On the 8th of March, Johan Karlsson joined TIQQE. After two weeks with us, we asked a few questions to see if the reasons he had to join us have been met so far.
What did you know about TIQQE before you started? I worked at Enfo when TIQQE was founded and I’ve kept an eye on the company ever since.
Why did you want to join TIQQE? Because I believe in what the company stands for and because several friends and former colleagues already work here. I wanted to have a lot of smart, inspiring tech nerds around me, and I like the technologies that TIQQE works with.
What was your first impression of TIQQE your first week? There is a warm and friendly atmosphere here. We’ve been a few people at the office which helps with the intro. It has felt like a big re-union.
What is your role at TIQQE? Developer and tech lead.
How has your first time been at TIQQE? I’ve gotten a good introduction to people, processes and tools from Cajza. I’m obviously used to working remotely, but starting from scratch remotely is a bit different. A lot of new things to learn.
What are you looking forward to in the nearest future? I’ve started in the PostNord retail team. I’m looking forward to learning the business side, learning AWS, and getting to know the team and the whole of TIQQE better.
What do you know about TIQQE now? It’s as nice as I hoped it would be.