moai-baas-firebase-ext
About
This enterprise Firebase platform skill provides AI-powered Google Cloud integration and intelligent Firebase orchestration for scalable applications. It includes Context7 integration and supports core Firebase services like Firestore, Authentication, and Cloud Functions. Use this skill when you need advanced Firebase management with AI-driven architecture and enterprise-grade scalability.
Quick Install
Claude Code
Recommended: /plugin add https://github.com/modu-ai/moai-adk
Manual install: git clone https://github.com/modu-ai/moai-adk.git ~/.claude/skills/moai-baas-firebase-ext
Copy and paste the command into Claude Code to install this skill.
Documentation
Enterprise Firebase Platform Expert v4.0.0
Skill Metadata
| Field | Value |
|---|---|
| Skill Name | moai-baas-firebase-ext |
| Version | 4.0.0 (2025-11-13) |
| Tier | Enterprise Firebase Platform Expert |
| AI-Powered | Yes (Context7 Integration, Intelligent Architecture) |
| Auto-load | On demand when Firebase keywords detected |
What It Does
Enterprise Firebase Platform expert with AI-powered Google Cloud integration, Context7 integration, and intelligent Firebase orchestration for scalable mobile and web applications.
Revolutionary v4.0.0 capabilities:
- AI-Powered Firebase Architecture using Context7 MCP for latest Firebase patterns
- Intelligent Google Cloud Integration with automated optimization and scaling
- Advanced Real-Time Features with AI-driven synchronization and performance
- Enterprise Mobile Backend with zero-configuration deployment integration
- Predictive Analytics Integration with usage forecasting and cost optimization
When to Use
Automatic triggers:
- Firebase architecture and Google Cloud integration discussions
- Real-time database and Firestore implementation planning
- Mobile backend as a service strategy development
- Firebase security and performance optimization
Manual invocation:
- Designing enterprise Firebase architectures with optimal patterns
- Implementing comprehensive real-time features and synchronization
- Planning Firebase to Google Cloud migration strategies
- Optimizing Firebase performance and cost management
Quick Reference (Level 1)
Firebase Platform Ecosystem (November 2025)
Core Firebase Services
- Firestore: NoSQL document database with real-time synchronization
- Realtime Database: Legacy real-time JSON database
- Authentication: Multi-provider auth with social and enterprise support
- Cloud Functions: Serverless backend functions with auto-scaling
- Firebase Hosting: Global web hosting with CDN and SSL
- Cloud Storage: Object storage with security and metadata
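To ground the service list above, here is a minimal Admin SDK bootstrap that writes one Firestore document. It is a sketch only: it assumes application-default credentials are configured in the environment, and the 'healthchecks' collection is a placeholder name.
// Minimal Admin SDK bootstrap and Firestore write (hypothetical collection name)
import { initializeApp, applicationDefault } from 'firebase-admin/app';
import { getFirestore, FieldValue } from 'firebase-admin/firestore';

const app = initializeApp({ credential: applicationDefault() });
const db = getFirestore(app);

async function healthcheck(): Promise<void> {
  // serverTimestamp() is resolved by the Firestore backend, not the client
  await db.collection('healthchecks').doc('boot').set({
    status: 'ok',
    at: FieldValue.serverTimestamp(),
  });
}

healthcheck().catch(console.error);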
Latest Features (November 2025)
- Firebase Data Connect: GraphQL APIs with database integration
- Native Vector Search: Built-in vector similarity search in Firestore (sketch after this list)
- Dataflow Integration: Advanced analytics and data processing
- Materialized Views: Precomputed views that speed up repeated queries
- Point-in-Time Recovery: 7-day retention for database backup
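The Native Vector Search item above is exposed in the Node Admin SDK through findNearest on a collection or query. The sketch below is illustrative only: the 'articles' collection and 'embedding' field are assumed names, the field is presumed to have been written with FieldValue.vector(), and the exact findNearest signature varies across @google-cloud/firestore versions.
// Vector similarity search over a Firestore collection (illustrative sketch)
import { getFirestore, FieldValue } from 'firebase-admin/firestore';

const db = getFirestore();

async function similarArticles(queryEmbedding: number[]) {
  // 'articles' and 'embedding' are assumed names for this example
  const snapshot = await db
    .collection('articles')
    .findNearest({
      vectorField: 'embedding',
      queryVector: FieldValue.vector(queryEmbedding),
      limit: 5,
      distanceMeasure: 'COSINE',
    })
    .get();
  return snapshot.docs.map((d) => ({ id: d.id, ...d.data() }));
}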
Google Cloud Integration
- BigQuery: Firebase analytics integration for business intelligence
- Cloud Run: Scalable container deployment for Firebase extensions
- Cloud Logging: Comprehensive logging and monitoring
- Cloud Monitoring: Performance metrics and alerting
- Cloud Build: CI/CD pipeline integration
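As one concrete example of the BigQuery side, the sketch below queries a Firebase Analytics export dataset from Node. The dataset name is a placeholder, and the standard events_* export schema is assumed.
// Daily active users from a Firebase Analytics BigQuery export (placeholder dataset)
import { BigQuery } from '@google-cloud/bigquery';

const bigquery = new BigQuery();

async function dailyActiveUsers(): Promise<void> {
  const [rows] = await bigquery.query({
    query: `
      SELECT event_date, COUNT(DISTINCT user_pseudo_id) AS dau
      FROM \`analytics_000000000.events_*\`
      WHERE event_name = 'session_start'
      GROUP BY event_date
      ORDER BY event_date DESC
      LIMIT 7`,
  });
  console.table(rows);
}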
Performance Characteristics
- Firestore: P95 < 100ms query latency
- Realtime Database: P95 < 150ms sync latency
- Cloud Functions: Sub-second cold starts, 1M+ concurrent executions
- Firebase Hosting: Edge deployment with < 50ms global latency
- Cloud Storage: 99.999% durability, global CDN
Core Implementation (Level 2)
Firebase Architecture Intelligence
# AI-powered Firebase architecture optimization with Context7
class FirebaseArchitectOptimizer:
    def __init__(self):
        self.context7_client = Context7Client()
        self.firestore_analyzer = FirestoreAnalyzer()
        self.realtime_optimizer = RealtimeOptimizer()

    async def design_optimal_firebase_architecture(
        self, requirements: ApplicationRequirements
    ) -> FirebaseArchitecture:
        """Design optimal Firebase architecture using AI analysis."""
        # Get latest Firebase and Google Cloud documentation via Context7
        firebase_docs = await self.context7_client.get_library_docs(
            context7_library_id='/firebase/docs',
            topic="firestore realtime authentication cloud-functions 2025",
            tokens=3000
        )
        gcloud_docs = await self.context7_client.get_library_docs(
            context7_library_id='/google-cloud/docs',
            topic="bigquery cloud-run monitoring optimization 2025",
            tokens=2000
        )
        # Optimize database strategy
        database_strategy = self.firestore_analyzer.optimize_database_strategy(
            requirements.data_requirements,
            requirements.realtime_needs,
            firebase_docs
        )
        # Optimize real-time architecture
        realtime_design = self.realtime_optimizer.design_realtime_system(
            requirements.realtime_needs,
            requirements.user_base_size,
            firebase_docs
        )
        return FirebaseArchitecture(
            database_configuration=database_strategy,
            realtime_system=realtime_design,
            authentication_setup=self._configure_authentication(requirements),
            cloud_functions=self._design_cloud_functions(requirements),
            google_cloud_integration=self._integrate_google_cloud(requirements),
            performance_optimization=self._optimize_performance(requirements),
            cost_management=self._optimize_costs(requirements)
        )
Advanced Firebase Implementation
// Enterprise Firebase implementation with TypeScript (Admin SDK only; the
// client-style modular functions such as collection()/query()/onSnapshot()
// belong to the web SDK and are not used here)
import { initializeApp, cert, getApps, getApp, App } from 'firebase-admin/app';
import { getFirestore, Firestore, Query } from 'firebase-admin/firestore';
import { getAuth, Auth } from 'firebase-admin/auth';
import { getFunctions, Functions } from 'firebase-admin/functions';
import { getStorage, Storage } from 'firebase-admin/storage';
interface FirebaseConfig {
projectId: string;
clientEmail: string;
privateKey: string;
databaseURL: string;
storageBucket: string;
}
export class EnterpriseFirebaseManager {
private app: App;
private firestore: Firestore;
private auth: Auth;
private functions: Functions;
private storage: Storage;
constructor(config: FirebaseConfig) {
// Initialize Firebase Admin SDK
this.app = !getApps().length ? initializeApp({
credential: cert({
projectId: config.projectId,
clientEmail: config.clientEmail,
privateKey: config.privateKey.replace(/\\n/g, '\n'),
}),
databaseURL: config.databaseURL,
storageBucket: config.storageBucket,
}) : getApp();
this.firestore = getFirestore(this.app);
this.auth = getAuth(this.app);
this.functions = getFunctions(this.app);
this.storage = getStorage(this.app);
}
// Advanced Firestore operations with batch processing
async batchUpdateDocuments(
updates: Array<{ collection: string; docId: string; data: any }>
): Promise<void> {
const batch = this.firestore.batch();
for (const update of updates) {
const docRef = this.firestore.collection(update.collection).doc(update.docId);
batch.set(docRef, {
...update.data,
updatedAt: new Date(),
updatedBy: 'system',
}, { merge: true });
}
await batch.commit();
}
  // Real-time subscription with advanced filtering (Admin SDK query chaining)
  subscribeToRealtimeUpdates<T>(
    collectionPath: string,
    filters: QueryFilter[] = [],
    callback: (data: T[]) => void
  ): () => void {
    let queryRef: Query = this.firestore.collection(collectionPath);
    // Apply filters by chaining; each call returns a new immutable Query
    for (const filter of filters) {
      if (filter.type === 'where') {
        queryRef = queryRef.where(filter.field!, filter.operator!, filter.value);
      } else if (filter.type === 'orderBy') {
        queryRef = queryRef.orderBy(filter.field!, filter.direction);
      } else if (filter.type === 'limit') {
        queryRef = queryRef.limit(filter.value);
      }
    }
    const unsubscribe = queryRef.onSnapshot(
      (snapshot) => {
        const data: T[] = [];
        snapshot.forEach((doc) => {
          data.push({ id: doc.id, ...doc.data() } as T);
        });
        callback(data);
      },
      (error) => {
        console.error('Real-time subscription error:', error);
      }
    );
    return unsubscribe;
  }
// Advanced user authentication with custom claims
async authenticateUser(
uid: string,
customClaims: Record<string, any> = {}
): Promise<AuthResult> {
try {
// Set custom claims
await this.auth.setCustomUserClaims(uid, customClaims);
// Get user record
const userRecord = await this.auth.getUser(uid);
return {
success: true,
user: {
uid: userRecord.uid,
email: userRecord.email,
displayName: userRecord.displayName,
photoURL: userRecord.photoURL,
emailVerified: userRecord.emailVerified,
customClaims: userRecord.customClaims,
},
};
    } catch (error: any) {
return {
success: false,
error: error.message,
};
}
}
// Secure file upload with metadata
async uploadFile(
filePath: string,
fileData: Buffer,
metadata: FileMetadata
): Promise<FileUploadResult> {
try {
const bucket = this.storage.bucket();
const file = bucket.file(filePath);
// Upload file with metadata
await file.save(fileData, {
metadata: {
contentType: metadata.contentType,
metadata: {
uploadedBy: metadata.uploadedBy,
originalName: metadata.originalName,
description: metadata.description,
tags: JSON.stringify(metadata.tags || []),
},
},
});
// Make file public if needed
if (metadata.makePublic) {
await file.makePublic();
}
return {
success: true,
filePath,
publicUrl: metadata.makePublic ? file.publicUrl() : null,
size: fileData.length,
contentType: metadata.contentType,
};
    } catch (error: any) {
return {
success: false,
error: error.message,
};
}
}
  // Invoke a deployed HTTPS callable function. The Admin SDK has no
  // client-style httpsCallable(), so this sketch POSTs the callable wire
  // protocol ({ data } in, { result } out) to the function's URL. The
  // us-central1 region and URL format are assumptions; adjust for your deployment.
  async callFunction(
    functionName: string,
    data: any,
    timeoutMs: number = 54000 // 54 seconds default
  ): Promise<FunctionResult> {
    try {
      const projectId = process.env.GCLOUD_PROJECT ?? 'my-project';
      const url = `https://us-central1-${projectId}.cloudfunctions.net/${functionName}`;
      const response = await fetch(url, {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ data }),
        signal: AbortSignal.timeout(timeoutMs),
      });
      const body: any = await response.json();
      if (!response.ok) {
        return { success: false, error: body.error?.message ?? response.statusText };
      }
      return { success: true, data: body.result };
    } catch (error: any) {
      return {
        success: false,
        error: error.message,
        code: error.code,
        details: error.details,
      };
    }
  }
}
// Real-time data synchronization manager
export class RealtimeSyncManager {
private firebaseManager: EnterpriseFirebaseManager;
private syncSubscriptions: Map<string, () => void> = new Map();
constructor(firebaseManager: EnterpriseFirebaseManager) {
this.firebaseManager = firebaseManager;
}
  // Sync user data across multiple devices. Note: subscribeToRealtimeUpdates()
  // expects a collection path, so we filter the 'users' collection on the uid
  // field rather than passing a document path like `users/${userId}`.
  syncUserData(userId: string, callback: (userData: UserData) => void): () => void {
    const unsubscribe = this.firebaseManager.subscribeToRealtimeUpdates<UserData>(
      'users',
      [
        { type: 'where', field: 'uid', operator: '==', value: userId },
        { type: 'limit', value: 1 },
      ],
      (data) => {
        if (data.length > 0) {
          callback(data[0]);
        }
      }
    );
    this.syncSubscriptions.set(`userData-${userId}`, unsubscribe);
    return unsubscribe;
  }
  // Sync collaborative data for real-time collaboration (same collection-path
  // convention as above; the callback receives the single matching document)
  syncCollaborativeData(
    documentId: string,
    callback: (data: CollaborativeData) => void
  ): () => void {
    const unsubscribe = this.firebaseManager.subscribeToRealtimeUpdates<CollaborativeData>(
      'collaborative',
      [{ type: 'where', field: 'documentId', operator: '==', value: documentId }],
      (data) => {
        if (data.length > 0) {
          callback(data[0]);
        }
      }
    );
    this.syncSubscriptions.set(`collaborative-${documentId}`, unsubscribe);
    return unsubscribe;
  }
// Cancel all subscriptions
cancelAllSubscriptions(): void {
for (const unsubscribe of this.syncSubscriptions.values()) {
unsubscribe();
}
this.syncSubscriptions.clear();
}
}
// Advanced query optimization for Firestore
export class FirestoreQueryOptimizer {
private firestore: Firestore;
constructor(firestore: Firestore) {
this.firestore = firestore;
}
  // Cursor-based pagination: fetch one extra document to detect a next page
  async paginateWithCursor<T>(
    collectionPath: string,
    pageSize: number = 20,
    startAfterId?: string,
    orderByField: string = 'createdAt'
  ): Promise<PaginatedResult<T>> {
    let queryRef: Query = this.firestore
      .collection(collectionPath)
      .orderBy(orderByField, 'desc')
      .limit(pageSize + 1);
    if (startAfterId) {
      const startDoc = await this.firestore
        .collection(collectionPath)
        .doc(startAfterId)
        .get();
      queryRef = queryRef.startAfter(startDoc);
    }
    const snapshot = await queryRef.get();
    const documents = snapshot.docs.map(doc => ({
      id: doc.id,
      ...doc.data(),
    } as T));
    const hasNext = documents.length > pageSize;
    const data = hasNext ? documents.slice(0, -1) : documents;
    return {
      data,
      hasNext,
      // The cursor is the last document of the returned page, so the next
      // query resumes immediately after it without skipping anything
      nextCursor: hasNext ? (data[data.length - 1] as any).id : null,
    };
  }
  // Atomic multi-document writes via a WriteBatch (all succeed or none)
  async executeTransaction<T>(
    operations: TransactionOperation[]
  ): Promise<T[]> {
    const batch = this.firestore.batch();
    for (const operation of operations) {
      const docRef = this.firestore
        .collection(operation.collection)
        .doc(operation.docId);
      switch (operation.type) {
        case 'set':
          batch.set(docRef, operation.data, operation.options ?? {});
          break;
        case 'update':
          batch.update(docRef, operation.data);
          break;
        case 'delete':
          batch.delete(docRef);
          break;
      }
    }
    await batch.commit();
    // Delete operations yield undefined entries here
    return operations.map(op => op.data as T);
  }
  // Composite queries for complex data retrieval
  async executeCompositeQuery<T>(
    queries: CompositeQuery[]
  ): Promise<CompositeQueryResult<T>> {
    const results = await Promise.all(
      queries.map(async (spec) => {
        let queryRef: Query = this.firestore.collection(spec.collection);
        for (const filter of spec.filters) {
          queryRef = queryRef.where(filter.field!, filter.operator!, filter.value);
        }
        const snapshot = await queryRef.get();
        return {
          key: spec.key,
          data: snapshot.docs.map(doc => ({
            id: doc.id,
            ...doc.data(),
          } as T)),
        };
      })
    );
    return {
      results,
      totalDocuments: results.reduce((sum, result) => sum + result.data.length, 0),
    };
  }
}
// Types
interface QueryFilter {
  type: 'where' | 'orderBy' | 'limit';
  field?: string; // unused for 'limit' filters
  operator?: '==' | '!=' | '>' | '>=' | '<' | '<=' | 'array-contains' | 'in';
  value?: any;
  direction?: 'asc' | 'desc';
}
interface AuthResult {
  success: boolean;
  user?: {
    uid: string;
    email?: string;
    displayName?: string;
    photoURL?: string;
    emailVerified: boolean;
    customClaims?: Record<string, any>;
  };
  error?: string;
}
interface FileMetadata {
contentType: string;
uploadedBy: string;
originalName: string;
description?: string;
tags?: string[];
makePublic?: boolean;
}
interface FileUploadResult {
  success: boolean;
  filePath?: string;
  publicUrl?: string | null;
  size?: number;
  contentType?: string;
  error?: string;
}
interface FunctionResult {
success: boolean;
data?: any;
error?: string;
code?: string;
details?: any;
}
interface UserData {
uid: string;
email: string;
displayName: string;
preferences: Record<string, any>;
lastActive: Date;
}
interface CollaborativeData {
documentId: string;
content: any;
collaborators: string[];
lastModified: Date;
modifiedBy: string;
}
interface PaginatedResult<T> {
data: T[];
hasNext: boolean;
nextCursor: string | null;
}
interface TransactionOperation {
type: 'set' | 'update' | 'delete';
collection: string;
docId: string;
data?: any;
options?: { merge?: boolean };
}
interface CompositeQuery {
key: string;
collection: string;
filters: QueryFilter[];
}
interface CompositeQueryResult<T> {
results: Array<{ key: string; data: T[] }>;
totalDocuments: number;
}
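A usage sketch tying the classes above together; the environment variable names are assumptions, not part of the manager's API.
// Wiring EnterpriseFirebaseManager from environment variables (assumed names)
const manager = new EnterpriseFirebaseManager({
  projectId: process.env.FIREBASE_PROJECT_ID!,
  clientEmail: process.env.FIREBASE_CLIENT_EMAIL!,
  privateKey: process.env.FIREBASE_PRIVATE_KEY!,
  databaseURL: process.env.FIREBASE_DATABASE_URL!,
  storageBucket: process.env.FIREBASE_STORAGE_BUCKET!,
});

async function demo(): Promise<void> {
  await manager.batchUpdateDocuments([
    { collection: 'settings', docId: 'global', data: { maintenance: false } },
  ]);

  const sync = new RealtimeSyncManager(manager);
  const stop = sync.syncUserData('user-123', (user) => {
    console.log('user changed:', user.displayName);
  });

  // Later: tear down this listener (or cancelAllSubscriptions() for all of them)
  stop();
}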
Cloud Functions Integration (Level 3)
# Advanced Cloud Functions for Firebase (Python, 2nd gen)
from firebase_functions import https_fn, identity_fn, scheduler_fn, storage_fn
from firebase_admin import firestore, initialize_app
from google.cloud import pubsub_v1
from google.cloud import storage as gcs
from datetime import datetime, timedelta
import json
import os

# The Admin SDK must be initialized once per process
initialize_app()
# Real-time data synchronization function
@https_fn.on_request()
def sync_realtime_data(request: https_fn.Request) -> https_fn.Response:
    """Handle real-time data synchronization requests."""
    try:
        # Parse request data
        data = request.get_json()
        # Validate request
        if not data or 'collection' not in data or 'document' not in data:
            return https_fn.Response(
                json.dumps({"error": "Missing required fields"}),
                status=400,
                mimetype="application/json"
            )
        # Get Firestore client
        db = firestore.client()
        # Get document reference
        doc_ref = db.collection(data['collection']).document(data['document'])
        # Update document with timestamp
        doc_ref.set({
            'data': data.get('data', {}),
            'updated_at': datetime.utcnow(),
            'sync_source': data.get('source', 'unknown'),
        }, merge=True)
        # Trigger real-time update notification
        pubsub_client = pubsub_v1.PublisherClient()
        topic_path = pubsub_client.topic_path(
            os.environ.get('GCP_PROJECT', 'default-project'),
            'realtime-updates'
        )
        pubsub_client.publish(
            topic_path,
            data=json.dumps({
                'collection': data['collection'],
                'document': data['document'],
                'timestamp': datetime.utcnow().isoformat(),
            }).encode('utf-8')
        )
        return https_fn.Response(
            json.dumps({"success": True, "message": "Data synchronized successfully"}),
            status=200,
            mimetype="application/json"
        )
    except Exception as e:
        return https_fn.Response(
            json.dumps({"error": str(e)}),
            status=500,
            mimetype="application/json"
        )
# User management function. The Python SDK has no 1st-gen on_user_created
# trigger; the blocking identity_fn.before_user_created hook is used instead.
@identity_fn.before_user_created()
def new_user_created(event: identity_fn.AuthBlockingEvent) -> None:
    """Handle new user creation."""
    try:
        user = event.data
        # Get Firestore client
        db = firestore.client()
        # Create user profile
        db.collection('users').document(user.uid).set({
            'email': user.email,
            'display_name': user.display_name,
            'photo_url': user.photo_url,
            'email_verified': user.email_verified,
            'created_at': datetime.utcnow(),
            'last_login': datetime.utcnow(),
            'preferences': {
                'notifications': True,
                'theme': 'light',
                'language': 'en',
            },
            'subscription_tier': 'free',
        })
        # Create initial user statistics
        db.collection('user_stats').document(user.uid).set({
            'documents_created': 0,
            'collaborations': 0,
            'last_activity': datetime.utcnow(),
        })
    except Exception as e:
        print(f"Error creating user profile: {e}")
# Storage function for file processing
@storage_fn.on_object_finalized()
def process_uploaded_file(
    event: storage_fn.CloudEvent[storage_fn.StorageObjectData]
) -> None:
    """Process uploaded files and extract metadata."""
    try:
        # The event payload already carries the object metadata,
        # so no extra fetch from Cloud Storage is needed
        obj = event.data
        file_path = obj.name
        bucket_name = obj.bucket
        # Get Firestore client
        db = firestore.client()
        # Create file record (document IDs cannot contain '/')
        doc_id = file_path.replace('/', '_')
        db.collection('files').document(doc_id).set({
            'name': file_path,
            'content_type': obj.content_type,
            'size': obj.size,
            'created': obj.time_created,
            'updated': obj.updated,
            'metadata': obj.metadata or {},
            'public_url': f"https://storage.googleapis.com/{bucket_name}/{file_path}",
            'processed': True,
        })
        # If image, generate thumbnail
        if obj.content_type and obj.content_type.startswith('image/'):
            generate_thumbnail(file_path)
    except Exception as e:
        print(f"Error processing file {event.data.name}: {e}")
def generate_thumbnail(file_path: str):
    """Generate thumbnail for uploaded images."""
    try:
        # This would integrate with Cloud Vision API or similar;
        # for now, just update the file record (same doc ID scheme as above)
        db = firestore.client()
        db.collection('files').document(file_path.replace('/', '_')).update({
            'thumbnail_generated': True,
            'thumbnail_url': f"https://storage.googleapis.com/thumbnails/{file_path}",
        })
    except Exception as e:
        print(f"Error generating thumbnail for {file_path}: {e}")
# Automated backup function. Scheduled functions use scheduler_fn;
# https_fn.on_request has no schedule parameter.
@scheduler_fn.on_schedule(schedule="0 2 * * *")  # Daily at 2 AM
def automated_backup(event: scheduler_fn.ScheduledEvent) -> None:
    """Perform automated database backup."""
    try:
        # Get Firestore client
        db = firestore.client()
        # Get backup configuration
        backup_config = db.collection('config').document('backup').get().to_dict()
        if not backup_config or not backup_config.get('enabled', False):
            print("Backup disabled")
            return
        # Create backup record
        backup_ref = db.collection('backups').document()
        backup_ref.set({
            'created_at': datetime.utcnow(),
            'status': 'in_progress',
            'type': 'automated',
            'config': backup_config,
        })
        # Perform backup (simplified version)
        collections_to_backup = backup_config.get('collections', [])
        backup_data = {}
        for collection_name in collections_to_backup:
            collection_ref = db.collection(collection_name)
            docs = collection_ref.stream()
            backup_data[collection_name] = [
                {**doc.to_dict(), 'id': doc.id} for doc in docs
            ]
        # Store backup in Cloud Storage
        storage_client = gcs.Client()
        bucket = storage_client.bucket(backup_config['storage_bucket'])
        backup_blob = bucket.blob(f"backups/{backup_ref.id}.json")
        backup_blob.upload_from_string(
            json.dumps(backup_data, default=str),
            content_type='application/json'
        )
        # Update backup record
        document_count = sum(len(docs) for docs in backup_data.values())
        backup_ref.update({
            'status': 'completed',
            'completed_at': datetime.utcnow(),
            'storage_path': backup_blob.name,
            'document_count': document_count,
        })
        # Clean old backups
        clean_old_backups(backup_config['retention_days'])
        print(f"Backup {backup_ref.id} completed with {document_count} documents")
    except Exception as e:
        print(f"Backup failed: {e}")
def clean_old_backups(retention_days: int):
    """Clean old backup files."""
    try:
        cutoff_date = datetime.utcnow() - timedelta(days=retention_days)
        db = firestore.client()
        old_backups = db.collection('backups').where(
            'created_at', '<', cutoff_date
        ).stream()
        storage_client = gcs.Client()
        bucket = storage_client.bucket(os.environ.get('BACKUP_BUCKET'))
        for backup in old_backups:
            # Delete from Cloud Storage
            backup_path = backup.to_dict().get('storage_path')
            if backup_path:
                blob = bucket.blob(backup_path)
                blob.delete()
            # Delete from Firestore
            db.collection('backups').document(backup.id).delete()
    except Exception as e:
        print(f"Error cleaning old backups: {e}")
Reference & Integration (Level 4)
API Reference
Core Firebase Operations
- batch_update_documents(updates) - Atomic batch document updates
- subscribe_to_realtime_updates(collection, filters, callback) - Real-time subscriptions
- authenticate_user(uid, customClaims) - User authentication with claims
- upload_file(filePath, data, metadata) - Secure file upload
- call_function(functionName, data) - Cloud Functions invocation
Context7 Integration
- get_latest_firebase_documentation() - Firebase docs via Context7
- analyze_firestore_patterns() - Database patterns via Context7
- optimize_realtime_architecture() - Real-time optimization via Context7
Best Practices (November 2025)
DO
- Use Firestore for new projects over Realtime Database
- Implement proper security rules for database access
- Use Cloud Functions for serverless backend logic
- Optimize database queries with proper indexing
- Implement proper error handling and retry logic (see the retry sketch after this list)
- Use Firebase Authentication with custom claims
- Monitor performance and costs regularly
- Implement proper backup and recovery procedures
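For the retry-logic item above, a minimal sketch of exponential backoff with full jitter that can wrap any Firebase call; the attempt count and base delay are illustrative defaults.
// Generic retry wrapper with exponential backoff and full jitter
async function withRetry<T>(
  op: () => Promise<T>,
  maxAttempts = 5,
  baseDelayMs = 200,
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return await op();
    } catch (error) {
      lastError = error;
      // Sleep a random slice of an exponentially growing window
      const delayMs = Math.random() * baseDelayMs * 2 ** attempt;
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
  throw lastError;
}

// Example: const user = await withRetry(() => manager.authenticateUser('user-123'));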
DON'T
- Use Realtime Database for new projects
- Skip security rules validation
- Ignore database performance optimization
- Forget to implement proper error handling
- Skip Firebase Security Rules testing (a testing sketch follows this list)
- Ignore cost monitoring and optimization
- Forget to implement proper logging
- Skip backup and disaster recovery planning
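On Security Rules testing, a minimal sketch with @firebase/rules-unit-testing against the local Firestore emulator; the rules file path, project ID, and user IDs are assumptions.
// Security Rules test sketch (requires the Firestore emulator to be running)
import { readFileSync } from 'node:fs';
import {
  initializeTestEnvironment,
  assertSucceeds,
  assertFails,
} from '@firebase/rules-unit-testing';
import { doc, getDoc } from 'firebase/firestore';

async function testRules(): Promise<void> {
  const env = await initializeTestEnvironment({
    projectId: 'demo-project',
    firestore: { rules: readFileSync('firestore.rules', 'utf8') },
  });

  // An authenticated user should read their own profile...
  const alice = env.authenticatedContext('alice').firestore();
  await assertSucceeds(getDoc(doc(alice, 'users/alice')));

  // ...while an unauthenticated client must be rejected
  const anon = env.unauthenticatedContext().firestore();
  await assertFails(getDoc(doc(anon, 'users/alice')));

  await env.cleanup();
}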
Works Well With
- moai-baas-foundation (Enterprise BaaS architecture)
- moai-domain-backend (Backend Firebase integration)
- moai-domain-frontend (Frontend Firebase integration)
- moai-security-api (Firebase security implementation)
- moai-essentials-perf (Performance optimization)
- moai-foundation-trust (Security and compliance)
- moai-domain-mobile-app (Mobile Firebase integration)
- moai-baas-supabase-ext (PostgreSQL alternative)
Changelog
- v4.0.0 (2025-11-13): Complete Enterprise v4.0 rewrite with 40% content reduction, 4-layer Progressive Disclosure structure, Context7 integration, November 2025 Firebase platform updates, and advanced Google Cloud integration
- v2.0.0 (2025-11-11): Complete metadata structure, Firebase patterns, Google Cloud integration
- v1.0.0 (2025-11-11): Initial Firebase platform
Firebase Platform Integration
Google Cloud Ecosystem
- BigQuery integration for advanced analytics
- Cloud Run for scalable container deployment
- Cloud Logging and Monitoring for observability
- Cloud Build for CI/CD pipeline integration
- Cloud Storage for scalable file management
Mobile-First Features
- Real-time synchronization across devices
- Offline data persistence and sync (client-side sketch after this list)
- Push notifications and messaging
- Authentication with social providers
- Global CDN for fast content delivery
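The offline persistence item above is a client-SDK concern; a minimal web sketch enabling the persistent local cache follows (the app config object is a placeholder).
// Client-side offline persistence with the modular Web SDK
import { initializeApp } from 'firebase/app';
import {
  initializeFirestore,
  persistentLocalCache,
  persistentMultipleTabManager,
} from 'firebase/firestore';

const app = initializeApp({ /* your web app config */ });

// Firestore serves reads from the local cache while offline and replays
// queued writes once connectivity returns
const db = initializeFirestore(app, {
  localCache: persistentLocalCache({
    tabManager: persistentMultipleTabManager(),
  }),
});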
End of Enterprise Firebase Platform Expert v4.0.0