devops

mrgoonie

About

This DevOps skill enables developers to deploy and manage cloud infrastructure across Cloudflare's edge platform, Docker containers, and Google Cloud Platform. Use it for deploying serverless functions, configuring edge computing solutions, managing containers, setting up CI/CD pipelines, and optimizing cloud costs. It's ideal for building cloud-native applications and implementing global caching strategies.

Documentation

DevOps Skill

Comprehensive guide for deploying and managing cloud infrastructure across the Cloudflare edge platform, Docker containerization, and Google Cloud Platform.

When to Use This Skill

Use this skill when:

  • Deploying serverless applications to Cloudflare Workers
  • Containerizing applications with Docker
  • Managing Google Cloud infrastructure with gcloud CLI
  • Setting up CI/CD pipelines across platforms
  • Optimizing cloud infrastructure costs
  • Implementing multi-region deployments
  • Building edge-first architectures
  • Managing container orchestration with Kubernetes
  • Configuring cloud storage solutions (R2, Cloud Storage)
  • Automating infrastructure with scripts and IaC

Platform Selection Guide

When to Use Cloudflare

Best For:

  • Edge-first applications with global distribution
  • Ultra-low latency requirements (<50ms)
  • Static sites with serverless functions
  • Zero egress cost scenarios (R2 storage)
  • WebSocket/real-time applications (Durable Objects)
  • AI/ML at the edge (Workers AI)

Key Products:

  • Workers (serverless functions)
  • R2 (object storage, S3-compatible)
  • D1 (SQLite database with global replication)
  • KV (key-value store)
  • Pages (static hosting + functions)
  • Durable Objects (stateful compute)
  • Browser Rendering (headless browser automation)

Cost Profile: Pay-per-request, generous free tier, zero egress fees
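
A minimal sketch of how these products attach to a Worker as bindings in wrangler.toml; the binding names and IDs below are placeholders, not values from this skill's references.

# wrangler.toml with example bindings (names and IDs are placeholders)
cat > wrangler.toml <<'EOF'
name = "my-worker"
main = "src/index.js"
compatibility_date = "2024-01-01"

[[kv_namespaces]]
binding = "CACHE"
id = "<kv-namespace-id>"

[[r2_buckets]]
binding = "ASSETS"
bucket_name = "my-bucket"

[[d1_databases]]
binding = "DB"
database_name = "my-db"
database_id = "<d1-database-id>"
EOF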

When to Use Docker

Best For:

  • Local development consistency
  • Microservices architectures
  • Multi-language stack applications
  • Traditional VPS/VM deployments
  • Kubernetes orchestration
  • CI/CD build environments
  • Database containerization (dev/test)

Key Capabilities:

  • Application isolation and portability
  • Multi-stage builds for optimization
  • Docker Compose for multi-container apps
  • Volume management for data persistence
  • Network configuration and service discovery
  • Cross-platform compatibility (amd64, arm64)

Cost Profile: Infrastructure cost only (compute + storage)
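
A short Docker Compose sketch of a multi-container app with a named volume and built-in service discovery (the web service reaches the database at the hostname db); service names and credentials are placeholders.

# docker-compose.yml: app + database with a named volume (values are placeholders)
cat > docker-compose.yml <<'EOF'
services:
  web:
    build: .
    ports:
      - "3000:3000"
    environment:
      - DATABASE_URL=postgres://app:app@db:5432/app
    depends_on:
      - db
  db:
    image: postgres:16-alpine
    environment:
      - POSTGRES_USER=app
      - POSTGRES_PASSWORD=app
      - POSTGRES_DB=app
    volumes:
      - db-data:/var/lib/postgresql/data
volumes:
  db-data:
EOF

# Start the stack in the background
docker compose up -d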

When to Use Google Cloud

Best For:

  • Enterprise-scale applications
  • Data analytics and ML pipelines (BigQuery, Vertex AI)
  • Hybrid/multi-cloud deployments
  • Kubernetes at scale (GKE)
  • Managed databases (Cloud SQL, Firestore, Spanner)
  • Complex IAM and compliance requirements

Key Services:

  • Compute Engine (VMs)
  • GKE (managed Kubernetes)
  • Cloud Run (containerized serverless)
  • App Engine (PaaS)
  • Cloud Storage (object storage)
  • Cloud SQL (managed databases)

Cost Profile: Varied pricing, sustained use discounts, committed use contracts
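
A typical first-run sketch with the gcloud CLI, assuming a placeholder project ID; enable only the APIs your deployment actually needs.

# Select a project and enable common APIs (project ID is a placeholder)
gcloud config set project my-project-id
gcloud services enable \
  run.googleapis.com \
  artifactregistry.googleapis.com \
  compute.googleapis.com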

Quick Start

Cloudflare Workers

# Install Wrangler CLI
npm install -g wrangler

# Create and deploy Worker
wrangler init my-worker
cd my-worker
wrangler deploy

See: references/cloudflare-workers-basics.md
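
For reference, a minimal Worker entry point in module syntax; the file path and greeting are illustrative, and the scaffold generated by wrangler init may differ.

# src/index.js: a minimal fetch handler (illustrative only)
cat > src/index.js <<'EOF'
export default {
  async fetch(request, env, ctx) {
    // Respond from the nearest edge location
    return new Response("Hello from the edge!");
  },
};
EOF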

Docker Container

# Create Dockerfile
cat > Dockerfile <<EOF
FROM node:20-alpine
WORKDIR /app
COPY package*.json ./
RUN npm ci --omit=dev
COPY . .
EXPOSE 3000
CMD ["node", "server.js"]
EOF

# Build and run
docker build -t myapp .
docker run -p 3000:3000 myapp

See: references/docker-basics.md
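
Because COPY . . pulls in the entire build context, a .dockerignore keeps local-only files out of the image; the entries below are a typical starting point, not a required list.

# .dockerignore: exclude local artifacts from the build context
cat > .dockerignore <<'EOF'
node_modules
.git
.env
*.log
EOF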

Google Cloud Deployment

# Install and authenticate
curl https://sdk.cloud.google.com | bash
gcloud init
gcloud auth login

# Deploy to Cloud Run
gcloud run deploy my-service \
  --image gcr.io/project/image \
  --region us-central1

See: references/gcloud-platform.md
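
If the image referenced in the deploy command does not exist yet, Cloud Build can build and push it from the local source; the project and image names are placeholders.

# Build and push the container image with Cloud Build (placeholder names)
gcloud builds submit --tag gcr.io/project/image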

Reference Navigation

Cloudflare Platform

  • cloudflare-platform.md - Edge computing overview, key components
  • cloudflare-workers-basics.md - Getting started, handler types, basic patterns
  • cloudflare-workers-advanced.md - Advanced patterns, performance, optimization
  • cloudflare-workers-apis.md - Runtime APIs, bindings, integrations
  • cloudflare-r2-storage.md - R2 object storage, S3 compatibility, best practices
  • cloudflare-d1-kv.md - D1 SQLite database, KV store, use cases
  • browser-rendering.md - Puppeteer/Playwright automation on Cloudflare

Docker Containerization

  • docker-basics.md - Core concepts, Dockerfile, images, containers
  • docker-compose.md - Multi-container apps, networking, volumes

Google Cloud Platform

  • gcloud-platform.md - GCP overview, gcloud CLI, authentication
  • gcloud-services.md - Compute Engine, GKE, Cloud Run, App Engine

Python Utilities

  • scripts/cloudflare-deploy.py - Automate Cloudflare Worker deployments
  • scripts/docker-optimize.py - Analyze and optimize Dockerfiles

Common Workflows

Edge + Container Hybrid

# Cloudflare Workers (API Gateway)
# -> Docker containers on Cloud Run (Backend Services)
# -> R2 (Object Storage)

# Benefits:
# - Edge caching and routing
# - Containerized business logic
# - Global distribution
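
A sketch of the gateway Worker in this pattern, assuming a hypothetical Cloud Run hostname; it forwards each request to the backend and lets Cloudflare cache GET responses at the edge.

# src/index.js: edge gateway in front of a Cloud Run backend (hostname is hypothetical)
cat > src/index.js <<'EOF'
export default {
  async fetch(request) {
    const url = new URL(request.url);
    url.hostname = "my-service-abc123-uc.a.run.app"; // hypothetical Cloud Run URL
    // Forward the original method, headers, and body; cache GETs at the edge
    return fetch(new Request(url, request), {
      cf: { cacheEverything: request.method === "GET" },
    });
  },
};
EOF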

Multi-Stage Docker Build

# Build stage
FROM node:20-alpine AS build
WORKDIR /app
COPY package*.json ./
RUN npm ci
COPY . .
RUN npm run build
# Drop devDependencies so only runtime packages are copied into the final image
RUN npm prune --omit=dev

# Production stage
FROM node:20-alpine
WORKDIR /app
COPY --from=build /app/dist ./dist
COPY --from=build /app/node_modules ./node_modules
USER node
CMD ["node", "dist/server.js"]

CI/CD Pipeline Pattern

# 1. Build: Docker multi-stage build
# 2. Test: Run tests in container
# 3. Push: Push to registry (GCR, Docker Hub)
# 4. Deploy: Deploy to Cloudflare Workers / Cloud Run
# 5. Verify: Health checks and smoke tests
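
A condensed shell sketch of these five steps; the image name, service, and health-check URL are placeholders, and a real pipeline would run these commands from CI (for example GitHub Actions or Cloud Build).

# Hypothetical end-to-end pipeline (adjust names to your project)
IMAGE="gcr.io/my-project/myapp:${GIT_SHA:-dev}"

docker build -t "$IMAGE" .                          # 1. Build the production image
docker build --target build -t "$IMAGE-test" .      #    Build-stage image keeps dev deps
docker run --rm "$IMAGE-test" npm test              # 2. Run tests in a container
docker push "$IMAGE"                                # 3. Push to registry
gcloud run deploy my-service --image "$IMAGE" \
  --region us-central1                              # 4. Deploy
curl -fsS https://my-service.example.com/health     # 5. Smoke test (placeholder URL)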

Best Practices

Security

  • Run containers as non-root user
  • Use service account impersonation (GCP)
  • Store secrets in environment variables, not code
  • Scan images for vulnerabilities (Docker Scout)
  • Use API tokens with minimal permissions
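
For example, secrets can live outside the codebase on each platform (names and values below are placeholders):

# Cloudflare: attach a secret to the Worker (prompts for the value)
wrangler secret put API_KEY

# Google Cloud: store a secret in Secret Manager
echo -n "placeholder-value" | gcloud secrets create api-key --data-file=-

# Docker: inject secrets at runtime instead of baking them into the image
docker run --env-file .env myapp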

Performance

  • Multi-stage Docker builds to reduce image size
  • Edge caching with Cloudflare KV
  • Use R2 for zero egress cost storage
  • Implement health checks for containers
  • Set appropriate timeouts and resource limits

Cost Optimization

  • Use Cloudflare R2 instead of S3 for large egress
  • Implement caching strategies (edge + KV)
  • Right-size container resources
  • Use sustained use discounts (GCP)
  • Monitor usage with cloud provider dashboards

Development

  • Use Docker Compose for local development
  • Use wrangler dev for local Worker testing
  • Use named gcloud configurations for multiple environments
  • Version control infrastructure code
  • Implement automated testing in CI/CD
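
A small sketch of named gcloud configurations, one per environment; the project IDs are placeholders.

# Create one gcloud configuration per environment (placeholder project IDs)
gcloud config configurations create staging
gcloud config set project my-staging-project

gcloud config configurations create production
gcloud config set project my-production-project

# Switch between environments
gcloud config configurations activate staging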

Decision Matrix

Need -> Choose
Sub-50ms latency globally -> Cloudflare Workers
Large file storage (zero egress) -> Cloudflare R2
SQL database (global reads) -> Cloudflare D1
Containerized workloads -> Docker + Cloud Run/GKE
Enterprise Kubernetes -> GKE
Managed relational DB -> Cloud SQL
Static site + API -> Cloudflare Pages
WebSocket/real-time -> Cloudflare Durable Objects
ML/AI pipelines -> GCP Vertex AI
Browser automation -> Cloudflare Browser Rendering

Resources

Implementation Checklist

Cloudflare Workers

  • Install Wrangler CLI
  • Create Worker project
  • Configure wrangler.toml (bindings, routes)
  • Test locally with wrangler dev
  • Deploy with wrangler deploy

Docker

  • Write Dockerfile with multi-stage builds
  • Create .dockerignore file
  • Test build locally
  • Push to registry
  • Deploy to target platform

Google Cloud

  • Install gcloud CLI
  • Authenticate with service account
  • Create project and enable APIs
  • Configure IAM permissions
  • Deploy and monitor resources

Quick Install

/plugin add https://github.com/mrgoonie/claudekit-skills/tree/main/devops

Copy and paste this command into Claude Code to install this skill.

GitHub Repository

mrgoonie/claudekit-skills
Path: .claude/skills/devops

Related Skills

sglang

Meta

SGLang is a high-performance LLM serving framework that specializes in fast, structured generation for JSON, regex, and agentic workflows using its RadixAttention prefix caching. It delivers significantly faster inference, especially for tasks with repeated prefixes, making it ideal for complex, structured outputs and multi-turn conversations. Choose SGLang over alternatives like vLLM when you need constrained decoding or are building applications with extensive prefix sharing.

evaluating-llms-harness

Testing

This Claude Skill runs the lm-evaluation-harness to benchmark LLMs across 60+ standardized academic tasks like MMLU and GSM8K. It's designed for developers to compare model quality, track training progress, or report academic results. The tool supports various backends including HuggingFace and vLLM models.

llamaguard

Other

LlamaGuard is Meta's 7-8B parameter model for moderating LLM inputs and outputs across six safety categories like violence and hate speech. It offers 94-95% accuracy and can be deployed using vLLM, Hugging Face, or Amazon SageMaker. Use this skill to easily integrate content filtering and safety guardrails into your AI applications.

langchain

Meta

LangChain is a framework for building LLM applications using agents, chains, and RAG pipelines. It supports multiple LLM providers, offers 500+ integrations, and includes features like tool calling and memory management. Use it for rapid prototyping and deploying production systems like chatbots, autonomous agents, and question-answering services.
