Let's make Cloud #44: Migrating AWS Infrastructure From Terraform to AWS CDK, A GitOps Terraform Controller, Deploy Generative AI Models on Amazon EKS

Hello CloudMakers!

Today we shall see:

  • Migrating AWS Infrastructure From Terraform to AWS CDK

  • A GitOps Terraform Controller

  • Deploy Generative AI Models on Amazon EKS

Enjoy!

Migrating AWS Infrastructure From Terraform to AWS CDK

Terraform has been a constant in the cloud infrastructure landscape in recent years. However, the emergence of the AWS Cloud Development Kit (AWS CDK) has changed the picture. This article is aimed at readers familiar with both tools: the author walks through the process of transitioning from Terraform to AWS CDK, backed by real-world experience. If you are considering the shift, or simply want to understand the differences between the two, this article offers a clear perspective.
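To give a feel for the destination of such a migration (this is not the author's code, just a minimal sketch), here is what a simple CDK v2 stack looks like in Python: it declares a versioned S3 bucket, roughly the counterpart of a Terraform aws_s3_bucket resource with versioning enabled. The stack and bucket names are illustrative.

```python
from aws_cdk import App, Stack
from aws_cdk import aws_s3 as s3
from constructs import Construct


class StorageStack(Stack):
    """Roughly equivalent to a Terraform module declaring one S3 bucket."""

    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # Counterpart of: resource "aws_s3_bucket" + versioning enabled
        s3.Bucket(self, "AssetsBucket", versioned=True)


app = App()
StorageStack(app, "StorageStack")
app.synth()  # emits the CloudFormation template, deployed with `cdk deploy`
```

Running `cdk synth` on a stack like this produces plain CloudFormation, which is one of the key differences from Terraform's state-based workflow that the article explores.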

TF-controller for Flux to reconcile Terraform resources in the GitOps way

TF-controller is a tool designed to work seamlessly with Flux, reconciling Terraform resources according to GitOps principles. What stands out about TF-controller is its flexibility: you can choose how deeply you want to integrate GitOps into your operations. While some might opt for a fully GitOps-driven workflow, others might prefer a more selective approach, for instance applying GitOps only to specific components of an existing EKS cluster, such as its node group or security group.

Additionally, TF-controller caters to those who already have a Terraform state (tfstate) file and wish either to strictly enforce that state or simply to detect drift, so they can make informed decisions when discrepancies arise. In essence, TF-controller offers a spectrum of GitOps models to suit varied infrastructure management needs.
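As a rough illustration of that model, the sketch below assembles a hypothetical Terraform custom resource for TF-controller in Python and prints it as YAML. The field names follow the infra.contrib.fluxcd.io CRD as I understand it, but the exact API version and the drift-detection setting should be checked against the controller version running in your cluster; the repository path and names are made up.

```python
import yaml  # requires PyYAML

# Hypothetical Terraform CR for TF-controller: reconcile one piece of an
# existing stack (here, an EKS node group module) from a Flux GitRepository.
terraform_cr = {
    "apiVersion": "infra.contrib.fluxcd.io/v1alpha2",  # check your installed CRD version
    "kind": "Terraform",
    "metadata": {"name": "eks-nodegroup", "namespace": "flux-system"},
    "spec": {
        "interval": "5m",
        # "auto" applies plans automatically; per the docs, setting this to
        # "disable" limits the controller to planning/drift detection only.
        "approvePlan": "auto",
        "path": "./eks/nodegroup",
        "sourceRef": {
            "kind": "GitRepository",
            "name": "infrastructure",
            "namespace": "flux-system",
        },
    },
}

print(yaml.safe_dump(terraform_cr, sort_keys=False))
```

Applying a manifest like this to a cluster where Flux and TF-controller are installed is what turns an existing Terraform module into a GitOps-reconciled resource.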

Deploy Generative AI Models on Amazon EKS

Generative AI is at the forefront of tech advancements. However, not everyone wants to rely solely on out-of-the-box services from cloud providers. For the advanced users who prefer to take the reins and deploy their own models, this article is tailored for you. With tools like JupyterHub, Argo Workflows, Ray, and RayServe, it delves into setting up Large Language Models (LLM) on Kubernetes, specifically on Amazon Elastic Kubernetes Service (Amazon EKS). If you're seeking a more hands-on approach to harnessing Gen AI, this guide will provide the insights you need to get started.
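To give a flavour of the Ray Serve side of that stack, here is a minimal, hypothetical Python deployment that wraps a small Hugging Face model behind an HTTP endpoint. The article targets larger LLMs served from a Ray cluster running on EKS, so treat the model choice, replica count, and GPU setting here as placeholders.

```python
from ray import serve
from starlette.requests import Request


@serve.deployment(num_replicas=1, ray_actor_options={"num_gpus": 1})  # drop num_gpus for CPU-only tests
class TextGenerator:
    def __init__(self) -> None:
        # Placeholder model; a real deployment would load the LLM of your choice.
        from transformers import pipeline
        self._pipe = pipeline("text-generation", model="gpt2")

    async def __call__(self, request: Request) -> str:
        payload = await request.json()
        result = self._pipe(payload["prompt"], max_new_tokens=64)
        return result[0]["generated_text"]


app = TextGenerator.bind()
# serve.run(app, route_prefix="/generate")  # deploy onto a running Ray cluster
```

On EKS, the same deployment would typically be packaged and submitted to a Ray cluster managed inside the Kubernetes cluster, which is the kind of setup the article walks through.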

Thank you for reading my newsletter!

If you liked it, please invite your friends to subscribe!

If you were forwarded this newsletter and liked it, you can subscribe for free here:

Have you read an article you liked and want to share it? Send it to me and you might see it published in this newsletter!

Interested in old issues? You can find them here!