Posts

Showing posts with the label docker

Simple FastAPI App with Docker and Minikube

Let's start with the simplest setup, one we can develop and test locally on a laptop or Mac.

✅ Simple FastAPI App with Docker and Minikube (Kubernetes)

📁 Folder Structure

```
fastapi-k8s-demo/
├── app/
│   └── main.py
├── Dockerfile
├── requirements.txt
├── k8s/
│   ├── deployment.yaml
│   └── service.yaml
```

📄 app/main.py

```python
from fastapi import FastAPI

app = FastAPI()

@app.get("/")
def read_root():
    return {"message": "Hello from FastAPI on Kubernetes!"}
```

📄 requirements.txt

```
fastapi
uvicorn
```

📄 Dockerfile

```dockerfile
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY app/ .
CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "8000"]
```

📄 k8s/deployment.yaml

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: fastapi-deployment
spec:
  replicas: 1
  selector:
    matchLabels:
      app: fastapi
  template:
    metadata: ...
```
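The excerpt cuts off before `k8s/service.yaml`. As a hedged sketch only (not the post's original file), a NodePort Service matching the `app: fastapi` label above might look like this; the name and port values are assumptions:

```yaml
# Hypothetical sketch -- not the original service.yaml from the post
apiVersion: v1
kind: Service
metadata:
  name: fastapi-service
spec:
  type: NodePort        # NodePort so `minikube service` can expose it locally
  selector:
    app: fastapi        # matches the Deployment's pod label above
  ports:
    - port: 8000
      targetPort: 8000  # the port uvicorn listens on in the container
```

With a Service like this applied, `minikube service fastapi-service` would open the app in a browser.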

Deploy ML Application to Azure

Azure DevOps + Azure ML based ML application deployment tutorial:

🔷 Introduction: Context of Azure, Azure ML & Azure DevOps

1. Azure Overview

Azure is Microsoft’s cloud computing platform offering a vast ecosystem of services for compute, storage, networking, databases, machine learning, DevOps, and more. It enables organizations to build, deploy, and manage applications and services through Microsoft-managed data centers.

Key Benefits:
- Global scalability and reliability
- Pay-as-you-go pricing
- Integrated security and compliance
- Strong ecosystem for enterprise DevOps and AI/ML workflows

2. What is Azure Machine Learning (Azure ML)?

Azure Machine Learning is a cloud-based platform for training, deploying, automating, and managing machine learning models. It supports both code-first (Python SDK, CLI) and no-code (Designer, Studio) approaches.

Key Components:
- Workspaces: Central place to manage assets and operations
- Compute Targets: For training &...

TensorRT-Specific LLM Optimizations for Jetson (NVIDIA Edge AI)

🚀 TensorRT-Specific LLM Optimizations for Jetson (NVIDIA Edge AI)

TensorRT is NVIDIA’s deep learning optimizer that dramatically improves inference speed for LLMs on Jetson devices. It enables:

✅ Faster inference (2-4x speedup) with lower latency.
✅ Lower power consumption on edge devices.
✅ Optimized memory usage for LLMs.

1️⃣ Install TensorRT & Dependencies

First, install TensorRT on your Jetson Orin/Nano:

```
sudo apt update
sudo apt install -y nvidia-cuda-toolkit tensorrt python3-libnvinfer
```

Confirm installation:

```
dpkg -l | grep TensorRT
```

2️⃣ Convert LLM to TensorRT Engine

TensorRT requires models in ONNX format before optimization.

Convert GGUF/Quantized Model → ONNX

First, convert your LLaMA/Mistral model to ONNX format:

```
python convert_to_onnx.py --model model.gguf --output model.onnx
```

(Use onnx_exporter.py from Hugging Face if needed.)

3️⃣ Optimize ONNX with TensorRT

Use trtexec to compile the ONNX model into a TensorRT engine:

```
trtexec --onnx=mo...
```
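As a rough illustration of step 3, here is a small helper that assembles a trtexec command line. The flags `--onnx`, `--saveEngine`, and `--fp16` are real trtexec options; the file names and the helper itself are illustrative, not from the post:

```python
# Sketch: build a trtexec invocation for compiling an ONNX model into a
# TensorRT engine. File names below are placeholders.

def build_trtexec_cmd(onnx_path, engine_path, fp16=True):
    """Return the trtexec argument list for the given model paths."""
    cmd = ["trtexec", f"--onnx={onnx_path}", f"--saveEngine={engine_path}"]
    if fp16:
        cmd.append("--fp16")  # half precision: typically faster on Jetson GPUs
    return cmd

print(" ".join(build_trtexec_cmd("model.onnx", "model.engine")))
# trtexec --onnx=model.onnx --saveEngine=model.engine --fp16
```

Wrapping the command this way makes it easy to script several precision variants (FP16 on/off) and benchmark each resulting engine.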

Convert Docker Compose to Kubernetes Orchestration

If you already have a Docker Compose based application, you may want to orchestrate its containers with Kubernetes. If you are new to Kubernetes, you can browse other articles in this blog or the Kubernetes website. Here's a step-by-step plan to migrate your Docker Compose application to Kubernetes:

Step 1: Create Kubernetes Configuration Files

- Create a directory for your Kubernetes configuration files (e.g., k8s-config).
- Create separate YAML files for each service (e.g., api.yaml, pgsql.yaml, mongodb.yaml, rabbitmq.yaml).
- Define Kubernetes resources (Deployments, Services, Persistent Volumes) for each service.

Step 2: Define Kubernetes Resources

Deployment YAML Example (api.yaml)

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: api-deployment
spec:
  replicas: 1
  selector:
    matchLabels:
      app: api
  template:
    metadata:
      labels:
        app: api
    spec:...
```
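A Deployment alone is not reachable by the other services, so each Compose service typically also gets a matching Kubernetes Service, replacing Compose's built-in DNS networking. A hypothetical companion Service for the api Deployment above (the name and port are illustrative, not from the original post):

```yaml
# Hypothetical companion to api.yaml -- exposes the api pods inside the cluster
apiVersion: v1
kind: Service
metadata:
  name: api-service
spec:
  selector:
    app: api          # matches the Deployment's pod label
  ports:
    - port: 8000      # illustrative port, not from the original post
      targetPort: 8000
```

Other pods can then reach the API at `http://api-service:8000`, much as Compose containers reached each other by service name.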

Microservices Application with Flutter Flask MongoDB RabbitMQ

A complete microservice application setup with a Flutter app, a Flask backend, MongoDB, and RabbitMQ, along with all the necessary files and folder structure. The setup uses Docker Compose to orchestrate the services.

Folder Structure

```
microservice-app/
│
├── backend/
│   ├── Dockerfile
│   ├── requirements.txt
│   ├── main.py
│   └── config.py
│
├── frontend/
│   ├── Dockerfile
│   ├── pubspec.yaml
│   └── lib/
│       └── main.dart
│
├── docker-compose.yml
└── README.md
```

1. `docker-compose.yml`

```yaml
version: '3.8'
services:
  backend:
    build: ./backend
    container_name: backend
    ports:
      - "8000:8000"
    depends_on:
      - mongodb
      - rabbitmq
    environment:
      - MONGO_URI=mongodb://mongodb:27017/flutterdb
      - RABBITMQ_URI=amqp://guest:guest@rabbitmq...
```
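The backend receives `MONGO_URI` and `RABBITMQ_URI` through the `environment:` block above, so its config module only needs to read them from the environment. A minimal sketch of what a `config.py` might contain; the localhost defaults are assumptions for running outside Docker, not the post's actual file:

```python
import os

# Connection settings injected by docker-compose via the `environment:` block.
# The fallback defaults point at localhost so the app can also be run
# directly on the host machine during development (assumed behavior).
MONGO_URI = os.getenv("MONGO_URI", "mongodb://localhost:27017/flutterdb")
RABBITMQ_URI = os.getenv("RABBITMQ_URI", "amqp://guest:guest@localhost:5672/")
```

Keeping all external endpoints in one module means the rest of the backend imports `config` instead of reading the environment in scattered places.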

Code Generation Engine Concept

Architecture Details for Code Generation Engine (Low-code)

1. Backend Framework:
- Python Framework:
  - FastAPI: A modern, fast (high-performance) web framework for building APIs with Python 3.6+ based on standard Python type hints.
  - SQLAlchemy: SQL toolkit and Object-Relational Mapping (ORM) library for database management.
  - Jinja2: A templating engine for rendering dynamic content.
  - Pydantic: Data validation and settings management using Python type annotations.

2. Application Structure:
- Project Root:
  - `app/`
    - `main.py` (Entry point of the application)
    - `models/`
      - `models.py` (Database models)
    - `schemas/`
      - `schemas.py` (Data validation schemas)
    - `api/`
      - `endpoints/`
        - `code_generation.py` (Endpoints related to code generation)
    - `core/`
      - `config.py` (Configu...
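The Jinja2 role listed above is the heart of a code generation engine: feed a template plus field definitions in, get source code out. A minimal sketch of that idea using only the standard library's `string.Template` as a stand-in for Jinja2; the template, class name, and fields are illustrative:

```python
from string import Template

# Illustrative stand-in for the Jinja2-based generation step:
# render a Pydantic-style model class from a field specification.
MODEL_TEMPLATE = Template("class $name(BaseModel):\n$fields")

def generate_model(name, fields):
    """Render a model class from {field_name: type_name} pairs."""
    body = "\n".join(f"    {f}: {t}" for f, t in fields.items())
    return MODEL_TEMPLATE.substitute(name=name, fields=body)

print(generate_model("User", {"id": "int", "email": "str"}))
# class User(BaseModel):
#     id: int
#     email: str
```

A real engine would swap `string.Template` for Jinja2 (loops, conditionals, filters) and write the rendered string to a file under the generated project tree.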

Compare Ubuntu and macOS

Features: Ubuntu Desktop vs macOS

Overall developer experience:
- Ubuntu: Offers a seamless, powerful platform that mirrors production environments on cloud, server, and IoT deployments. A top choice for AI and machine learning developers.
- macOS: Provides a user-friendly and intuitive interface with seamless integration across other Apple devices. Its well-documented resources and developer tools make it attractive for developers within the Apple ecosystem.

Cloud development:
- Ubuntu: Aligns with Ubuntu Server, the most popular OS on public clouds, for simplified cloud-native development. Supports cloud-based developer tools like Docker, LXD, MicroK8s, and Kubernetes. Ensures portability and cost optimisation since it can run on any private or public cloud platform.
- macOS: Relies on Docker and other virtualisation technologies for cloud development. Has seamless integration with iCloud services and native support for cloud-based application development.

Server operations:
- Ubuntu...