
MLA 017 AWS Local Development Environment

Machine Learning Guide

Release Date: 11/06/2021


AWS development environments for local and cloud deployment can differ significantly, leading to extra complexity and setup during cloud migration. By developing directly within AWS environments, using tools such as Lambda, Cloud9, SageMaker Studio, client VPN connections, or LocalStack, developers can streamline transitions to production and leverage AWS-managed services from the start. This episode outlines three primary strategies for treating AWS as your development environment, details the benefits and tradeoffs of each, and explains the role of infrastructure-as-code tools such as Terraform and CDK in maintaining replicable, trackable cloud infrastructure.

Links

Docker Fundamentals for Development

  • Docker containers encapsulate operating systems, packages, and code, which simplifies dependency management and deployment.
  • Files are added to containers either with the COPY instruction, for one-time inclusion at image build time, or with a volume mount, for live synchronization during development.
  • Docker Compose orchestrates multiple containers in a local environment, while Kubernetes handles container orchestration at larger scale in the cloud.
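The build-time versus development-time distinction above can be sketched in a minimal docker-compose.yml; the service names, paths, and ports here are hypothetical:

```yaml
services:
  web:
    build: .            # image built from a Dockerfile, whose COPY runs once at build time
    volumes:
      - ./src:/app/src  # volume mount: live-synced into the container during development
    ports:
      - "8000:8000"
  db:
    image: postgres:16
    environment:
      POSTGRES_PASSWORD: dev-only
```

The db service shows Compose coordinating a second container alongside the application, which is the local analogue of what ECS or Kubernetes does at scale.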

Docker and AWS Integration

  • Docker is frequently used in AWS, including for packaging and deploying Lambda functions, SageMaker jobs, and ECS/Fargate containers.
  • Deploying complex applications like web servers and databases on AWS involves using services such as ECR for image storage, ECS/Fargate for container management, RDS for databases, and requires configuration of networking components such as VPCs, subnets, and security groups.

Challenges in Migrating from Localhost to AWS

  • Local Docker Compose setups differ considerably from AWS managed services architecture.
  • Migrating to AWS involves extra steps such as pushing images to ECR, establishing networking with VPCs, configuring load balancers or API Gateway, setting up domain names with Route 53, and integrating SSL certificates via ACM.
  • Configuring internal communication between services and securing databases adds complexity compared to local development.
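As a sketch of one of those extra steps, the push-to-ECR sequence alone involves three commands; the account ID, region, and repository name below are hypothetical placeholders:

```python
# Sketch: the image-push portion of a localhost-to-AWS migration,
# assembled as shell commands. All identifiers are placeholders.
ACCOUNT, REGION, REPO = "123456789012", "us-east-1", "myapp"
registry = f"{ACCOUNT}.dkr.ecr.{REGION}.amazonaws.com"

steps = [
    # authenticate the local Docker client to ECR
    f"aws ecr get-login-password --region {REGION} "
    f"| docker login --username AWS --password-stdin {registry}",
    # tag the local image with the registry address
    f"docker tag {REPO}:latest {registry}/{REPO}:latest",
    # upload the image
    f"docker push {registry}/{REPO}:latest",
]

for cmd in steps:
    print(cmd)
```

The remaining steps — VPC, load balancer, Route 53, ACM — are configuration rather than commands, which is part of why the migration feels much heavier than docker-compose up.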

Strategy 1: Developing Entirely in the AWS Cloud

  • Developers can use AWS Lambda’s built-in code editor, Cloud9 IDE, and SageMaker Studio to edit, run, and deploy code directly in the AWS console.
  • Cloud-based development is not tied to a single machine and eliminates local environment setup.
  • While convenient, in-browser IDEs like Cloud9 and SageMaker Studio are less powerful than established local tools like PyCharm or DataGrip.

Strategy 2: Local Development Connected to AWS via Client VPN

  • The AWS Client VPN enables local machines to securely access AWS VPC resources, such as RDS databases or Lambda endpoints, as if they were on the same network.
  • This approach allows developers to continue using their preferred local IDEs while testing code against actual cloud services.
  • Sensitive credentials are stored in AWS Secrets Manager rather than in local files or environment variables.
  • Example tutorials and instructions:
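A minimal sketch of the Secrets Manager lookup, assuming boto3 is installed and the secret is stored as a JSON string; the secret name and field names are hypothetical:

```python
import json

def get_db_credentials(secret_id: str) -> dict:
    # boto3 is imported inside the function so this sketch loads
    # without the AWS SDK present; in practice it would be top-level.
    import boto3
    client = boto3.client("secretsmanager")
    resp = client.get_secret_value(SecretId=secret_id)
    # Secrets are commonly stored as JSON, e.g. {"username": ..., "password": ...}
    return json.loads(resp["SecretString"])

# Usage (hypothetical secret name):
# creds = get_db_credentials("myapp/dev/postgres")
```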

Strategy 3: Local Emulation of AWS Using LocalStack

  • LocalStack provides local, Docker-based emulation of AWS services, allowing development and testing without incurring cloud costs or latency.
  • The project offers a free tier supporting core serverless services and a paid tier covering more advanced features like RDS, ACM, and Route 53.
  • LocalStack supports mounting local source files into Lambda functions, enabling direct development on the local machine with changes immediately reflected in the emulated AWS environment.
  • This approach brings rapid iteration and cost savings, but coverage of AWS features may vary, especially for advanced or new AWS services.
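Pointing an AWS SDK at LocalStack is mostly a matter of overriding the endpoint. A sketch with boto3 — LocalStack's default edge port is 4566, and it conventionally accepts dummy credentials:

```python
LOCALSTACK_ENDPOINT = "http://localhost:4566"  # LocalStack's default edge port

def make_s3_client(use_localstack: bool = True):
    # boto3 imported lazily so the sketch loads without the AWS SDK installed
    import boto3
    if use_localstack:
        return boto3.client(
            "s3",
            endpoint_url=LOCALSTACK_ENDPOINT,
            region_name="us-east-1",
            aws_access_key_id="test",        # LocalStack accepts dummy credentials
            aws_secret_access_key="test",
        )
    return boto3.client("s3")  # falls through to the real AWS configuration
```

Because only the endpoint differs, the same application code can run against the emulator locally and against real AWS in production.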

Infrastructure as Code: Managing AWS Environments

  • Managing AWS resources through the web console is not sustainable for tracking or reproducing environments.
  • Infrastructure as code (IaC) tools such as Terraform, AWS CDK, and Serverless enable declarative, version-controlled description and deployment of AWS services.
  • Terraform offers broad multi-cloud compatibility and support for both managed and cloud-native services, whereas CDK is AWS-specific and typically more streamlined but supports fewer services.
  • Changes made via IaC tools are automatically propagated to dependent resources, reducing manual error and ensuring consistency across environments.
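As a sketch of the declarative style, a Terraform fragment describing a single bucket (the names are hypothetical); terraform apply creates it, and editing the block and re-applying propagates the change:

```hcl
# Sketch: declarative Terraform configuration (names are placeholders).
provider "aws" {
  region = "us-east-1"
}

resource "aws_s3_bucket" "dev" {
  bucket = "mla017-dev-bucket"
}
```

Because the file is plain text, it lives in version control alongside application code, giving the trackability that console-driven changes lack.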

Benefits of AWS-First Development

  • Developing directly in AWS or with local emulation ensures alignment between development, staging, and production environments, reducing last-minute deployment issues.
  • Early use of AWS services can reveal managed solutions—such as Cognito for authentication or Data Wrangler for feature transformation—that are more scalable and secure than homegrown implementations.
  • Infrastructure as code provides reproducibility, easier team onboarding, and disaster recovery.

Alternatives and Kubernetes

  • Kubernetes represents a different model of orchestrating containers and services, generally leveraging open source components inside Docker containers, independent of managed AWS services.
  • While Kubernetes can manage deployments to AWS (via EKS), GCP, or Azure, its architecture and operational concerns differ from AWS-native development patterns.

Additional AWS IDEs and Services

Conclusion

  • Choosing between developing in the AWS cloud, connecting local environments via VPN, or using tools like LocalStack depends on team needs, budget, and workflow preferences.
  • Emphasizing infrastructure as code ensures environments remain consistent, maintainable, and easily reproducible.