Low-Code Integration Patterns
MangaAssist context: JP Manga store chatbot on AWS — Bedrock Claude 3 (Sonnet at $3/$15 per 1M tokens input/output, Haiku at $0.25/$1.25), OpenSearch Serverless (vector store), DynamoDB (sessions/products), ECS Fargate (orchestrator), API Gateway WebSocket, ElastiCache Redis. Target: useful answer in under 3 seconds, 1M messages/day scale.
Skill Mapping
| Field | Value |
|---|---|
| Domain | 2 — Implementation & Integration |
| Task | 2.5 — Application Integration Patterns |
| Skill | 2.5.2 — Accessible AI Interfaces |
| Focus | Amplify configuration, OpenAPI code generation, Prompt Flow definitions |
| MangaAssist Relevance | Low-code patterns for frontend deployment, partner SDK generation, and no-code prompt iteration |
Mind Map
mindmap
  root((Low-Code Integration Patterns))
    Amplify Configuration
      amplify.yml Build Spec
        Install dependencies
        Build React app
        Output artifacts
      Environment Variables
        API endpoints
        Cognito config
        Feature flags
      Custom Headers
        Security headers
        CORS configuration
        Cache-Control
      Branch-Based Deploys
        Main → Production
        Develop → Staging
        PR → Preview
    OpenAPI Code Generation
      SDK Generation Pipeline
        openapi-generator-cli
        TypeScript client
        Python client
      API Gateway Import
        Request validation
        Model mapping
        Stage variables
      Contract Testing
        Schemathesis fuzzing
        Postman collections
        CI integration
    Prompt Flow Definitions
      Flow JSON Schema
        Input/output nodes
        Prompt templates
        Condition expressions
      Variable Binding
        Session variables
        Environment config
        Dynamic prompts
      Testing Flows
        Local simulation
        A/B comparison
        Metric collection
    Integration Accelerators
      Amplify Libraries
        API category
        Auth category
        Analytics category
      CDK Constructs
        L3 chat construct
        WebSocket + Lambda
        Cognito + APIGW
      Reusable Patterns
        Chat widget embed
        Webhook integration
        Event-driven flows
Amplify Configuration
Build and Deploy Configuration
The Amplify build specification controls how MangaAssist's React frontend is built and deployed. Environment-specific configuration ensures the same codebase targets different backends (staging, production).
graph TB
subgraph Git["Source Control"]
MAIN[main branch]
DEV[develop branch]
PR[Pull Request]
end
subgraph Amplify["Amplify Hosting"]
BUILD_MAIN[Build: Production<br/>npm run build]
BUILD_DEV[Build: Staging<br/>npm run build:staging]
BUILD_PR[Build: Preview<br/>npm run build:preview]
end
subgraph Deploy["Deployment"]
PROD_CF[CloudFront<br/>manga-assist.example.com]
STAGING_CF[CloudFront<br/>staging.manga-assist.example.com]
PREVIEW_CF[CloudFront<br/>pr-123.manga-assist.example.com]
end
MAIN -->|auto trigger| BUILD_MAIN
DEV -->|auto trigger| BUILD_DEV
PR -->|auto trigger| BUILD_PR
BUILD_MAIN --> PROD_CF
BUILD_DEV --> STAGING_CF
BUILD_PR --> PREVIEW_CF
style PROD_CF fill:#28a745,color:#fff
style STAGING_CF fill:#ffc107,color:#000
style PREVIEW_CF fill:#17a2b8,color:#fff
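The branch mapping above can be sketched as a small resolver. Amplify itself maps branches to environments through console/CLI settings; this helper (the function name and the `pr-<number>` branch convention are assumptions) is the kind of thing a CI script would use to apply the same logic:

```python
import re

# Hypothetical helper mirroring the branch → environment mapping above.
# Not part of the Amplify API; useful in CI scripts that need the same
# resolution logic as the hosting configuration.
def resolve_environment(branch: str) -> str:
    """Map a git branch name to a deployment environment."""
    if branch == "main":
        return "production"
    if branch == "develop":
        return "staging"
    # PR preview branches, e.g. "pr-123"
    if re.fullmatch(r"pr-\d+", branch):
        return "preview"
    raise ValueError(f"No deployment target for branch: {branch}")
```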
Amplify Build Specification
"""
MangaAssist Amplify Configuration Generator
Generates amplify.yml and environment configuration for multi-stage deployment.
"""
import json
import yaml
from dataclasses import dataclass, field
from typing import Optional
@dataclass
class AmplifyEnvironment:
"""Environment-specific configuration."""
name: str
ws_endpoint: str
rest_endpoint: str
cognito_pool_id: str
cognito_client_id: str
cognito_domain: str
analytics_id: Optional[str] = None
feature_flags: dict = field(default_factory=dict)
# MangaAssist environments
ENVIRONMENTS = {
"production": AmplifyEnvironment(
name="production",
ws_endpoint="wss://ws.manga-assist.example.com",
rest_endpoint="https://api.manga-assist.example.com/v2",
cognito_pool_id="ap-northeast-1_XXXXXXXXX",
cognito_client_id="xxxxxxxxxxxxxxxxxxxxxxxxxx",
cognito_domain="manga-assist.auth.ap-northeast-1.amazoncognito.com",
analytics_id="xxxxxxxxxx",
feature_flags={
"enableVoiceInput": False,
"enableImageSearch": True,
"maxHistoryDisplay": 50,
},
),
"staging": AmplifyEnvironment(
name="staging",
ws_endpoint="wss://ws-staging.manga-assist.example.com",
rest_endpoint="https://api-staging.manga-assist.example.com/v2",
cognito_pool_id="ap-northeast-1_YYYYYYYYY",
cognito_client_id="yyyyyyyyyyyyyyyyyyyyyyyyyy",
cognito_domain="manga-assist-staging.auth.ap-northeast-1.amazoncognito.com",
feature_flags={
"enableVoiceInput": True,
"enableImageSearch": True,
"maxHistoryDisplay": 100,
"debugMode": True,
},
),
}
def generate_amplify_yml() -> str:
"""Generate the amplify.yml build specification."""
spec = {
"version": 1,
"applications": [
{
"frontend": {
"phases": {
"preBuild": {
"commands": [
"npm ci --cache .npm --prefer-offline",
"echo 'REACT_APP_BUILD_TIME='$(date -u +%Y-%m-%dT%H:%M:%SZ) >> .env",
"echo 'REACT_APP_GIT_SHA='$(git rev-parse --short HEAD) >> .env",
]
},
"build": {
"commands": [
"npm run build",
]
},
"postBuild": {
"commands": [
"npm run test:lighthouse || true",
]
},
},
"artifacts": {
"baseDirectory": "build",
"files": ["**/*"],
},
"cache": {
"paths": [
"node_modules/**/*",
".npm/**/*",
],
},
},
"test": {
"phases": {
"preTest": {
"commands": ["npm ci"],
},
"test": {
"commands": [
"npm run test -- --coverage --watchAll=false",
"npm run test:e2e || true",
],
},
},
"artifacts": {
"baseDirectory": "coverage",
"files": ["**/*"],
},
},
}
],
}
return yaml.dump(spec, default_flow_style=False, allow_unicode=True)
def generate_amplify_config(env_name: str) -> dict:
"""
Generate Amplify client-side configuration.
This is the aws-exports.js equivalent for MangaAssist.
"""
env = ENVIRONMENTS.get(env_name, ENVIRONMENTS["staging"])
return {
"Auth": {
"Cognito": {
"userPoolId": env.cognito_pool_id,
"userPoolClientId": env.cognito_client_id,
"loginWith": {
"oauth": {
"domain": env.cognito_domain,
"scopes": ["openid", "profile", "email"],
"redirectSignIn": [
f"https://{'staging.' if env_name == 'staging' else ''}manga-assist.example.com/callback"
],
"redirectSignOut": [
f"https://{'staging.' if env_name == 'staging' else ''}manga-assist.example.com/"
],
"responseType": "code",
"providers": ["LINE", "Google"],
},
},
},
},
"API": {
"REST": {
"MangaAssistAPI": {
"endpoint": env.rest_endpoint,
"region": "ap-northeast-1",
},
},
"WebSocket": {
"MangaAssistChat": {
"endpoint": env.ws_endpoint,
},
},
},
"Analytics": {
"Pinpoint": {
"appId": env.analytics_id or "",
"region": "ap-northeast-1",
},
} if env.analytics_id else {},
"FeatureFlags": env.feature_flags,
}
def generate_custom_headers() -> dict:
"""Generate Amplify custom headers for security."""
return {
"customHeaders": [
{
"pattern": "**/*",
"headers": [
{
"key": "Strict-Transport-Security",
"value": "max-age=31536000; includeSubDomains",
},
{
"key": "X-Content-Type-Options",
"value": "nosniff",
},
{
"key": "X-Frame-Options",
"value": "DENY",
},
{
"key": "X-XSS-Protection",
"value": "1; mode=block",
},
{
"key": "Content-Security-Policy",
"value": (
"default-src 'self'; "
"script-src 'self'; "
"style-src 'self' 'unsafe-inline'; "
"img-src 'self' data: https:; "
"connect-src 'self' wss://*.manga-assist.example.com "
"https://*.manga-assist.example.com "
"https://cognito-idp.ap-northeast-1.amazonaws.com; "
"font-src 'self' data:;"
),
},
{
"key": "Referrer-Policy",
"value": "strict-origin-when-cross-origin",
},
{
"key": "Permissions-Policy",
"value": "camera=(), microphone=(), geolocation=()",
},
],
},
{
"pattern": "/static/**",
"headers": [
{
"key": "Cache-Control",
"value": "public, max-age=31536000, immutable",
},
],
},
{
"pattern": "/index.html",
"headers": [
{
"key": "Cache-Control",
"value": "public, max-age=0, must-revalidate",
},
],
},
]
}
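The environment config produced by `generate_amplify_config` ultimately has to reach the frontend as a JavaScript module. A minimal sketch of that last step (the module name `amplifyConfig` and the file layout are assumptions, not an Amplify requirement):

```python
import json

# Hypothetical writer: turns a client config dict (such as the one returned
# by generate_amplify_config) into an ES module the React build can import.
def render_amplify_config_js(config: dict) -> str:
    """Render a client config dict as an ES module for the frontend build."""
    body = json.dumps(config, indent=2, ensure_ascii=False)
    return f"const amplifyConfig = {body};\n\nexport default amplifyConfig;\n"

# Example with a minimal config fragment
snippet = render_amplify_config_js(
    {"API": {"REST": {"MangaAssistAPI": {"endpoint": "https://api.example.com/v2"}}}}
)
```

In the build pipeline this would typically run during `preBuild`, writing the module for the environment selected by the branch.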
OpenAPI Code Generation Pipeline
SDK Generation Architecture
graph TB
subgraph Source["API Source of Truth"]
SPEC[OpenAPI 3.1 Spec<br/>manga-assist-api.yaml]
end
subgraph Pipeline["Generation Pipeline"]
VALIDATE[Validate Spec<br/>spectral lint]
DIFF[Detect Breaking Changes<br/>oasdiff]
GEN_TS[Generate TypeScript SDK<br/>openapi-generator]
GEN_PY[Generate Python SDK<br/>openapi-generator]
GEN_DOCS[Generate API Docs<br/>redoc-cli]
end
subgraph Output["Generated Artifacts"]
TS_SDK[TypeScript Client<br/>@manga-assist/api-client]
PY_SDK[Python Client<br/>manga-assist-client]
API_DOCS[API Documentation<br/>Static HTML]
APIGW_IMPORT[API Gateway Import<br/>REST API definition]
end
subgraph CI["CI/CD"]
NPM[npm publish<br/>Private registry]
PYPI[pip publish<br/>CodeArtifact]
S3[S3 Static Docs<br/>CloudFront]
end
SPEC --> VALIDATE
VALIDATE --> DIFF
DIFF -->|No breaking| GEN_TS
DIFF -->|No breaking| GEN_PY
DIFF -->|No breaking| GEN_DOCS
DIFF -->|Breaking change| ALERT[Alert API Team<br/>Review required]
GEN_TS --> TS_SDK
GEN_PY --> PY_SDK
GEN_DOCS --> API_DOCS
SPEC --> APIGW_IMPORT
TS_SDK --> NPM
PY_SDK --> PYPI
API_DOCS --> S3
style SPEC fill:#85ea2d,color:#000
style ALERT fill:#dc3545,color:#fff
SDK Generation Script
"""
MangaAssist OpenAPI SDK Generation Pipeline
Automates TypeScript and Python client generation from the OpenAPI spec.
"""
import json
import subprocess
import shutil
import logging
from pathlib import Path
from dataclasses import dataclass
from typing import Optional
logger = logging.getLogger(__name__)
@dataclass
class SDKConfig:
"""Configuration for SDK generation."""
spec_path: str = "api/manga-assist-api.yaml"
output_base: str = "generated/sdks"
typescript_package_name: str = "@manga-assist/api-client"
typescript_version: str = "2.0.0"
python_package_name: str = "manga_assist_client"
python_version: str = "2.0.0"
generator_version: str = "7.2.0"
class SDKGenerator:
"""Generate typed API clients from OpenAPI spec."""
def __init__(self, config: Optional[SDKConfig] = None):
self.config = config or SDKConfig()
def validate_spec(self) -> bool:
"""Validate OpenAPI spec with Spectral linter."""
result = subprocess.run(
["npx", "spectral", "lint", self.config.spec_path],
capture_output=True,
text=True,
)
if result.returncode != 0:
logger.error(f"Spec validation failed:\n{result.stdout}")
return False
logger.info("OpenAPI spec validation passed")
return True
    def check_breaking_changes(self, previous_spec: str) -> dict:
        """
        Compare current spec against previous version for breaking changes.
        Uses oasdiff to detect backwards-incompatible changes. Passing
        --fail-on ERR makes oasdiff exit non-zero when breaking changes
        exist, so the return code below is meaningful.
        """
        result = subprocess.run(
            [
                "oasdiff", "breaking",
                "--base", previous_spec,
                "--revision", self.config.spec_path,
                "--format", "json",
                "--fail-on", "ERR",
            ],
            capture_output=True,
            text=True,
        )
        if result.returncode != 0 and result.stdout:
            changes = json.loads(result.stdout)
            logger.warning(
                f"Breaking changes detected: {len(changes)} issues"
            )
            return {"breaking": True, "changes": changes}
        return {"breaking": False, "changes": []}
def generate_typescript_client(self) -> str:
"""Generate TypeScript client SDK."""
output_dir = f"{self.config.output_base}/typescript"
shutil.rmtree(output_dir, ignore_errors=True)
cmd = [
"npx", f"@openapitools/openapi-generator-cli@{self.config.generator_version}",
"generate",
"-i", self.config.spec_path,
"-g", "typescript-fetch",
"-o", output_dir,
"--additional-properties", ",".join([
f"npmName={self.config.typescript_package_name}",
f"npmVersion={self.config.typescript_version}",
"supportsES6=true",
"typescriptThreePlus=true",
"withInterfaces=true",
"useSingleRequestParameter=true",
]),
"--type-mappings", "DateTime=string",
]
result = subprocess.run(cmd, capture_output=True, text=True)
if result.returncode != 0:
logger.error(f"TypeScript generation failed:\n{result.stderr}")
raise RuntimeError("TypeScript SDK generation failed")
logger.info(f"TypeScript SDK generated at {output_dir}")
return output_dir
def generate_python_client(self) -> str:
"""Generate Python client SDK."""
output_dir = f"{self.config.output_base}/python"
shutil.rmtree(output_dir, ignore_errors=True)
cmd = [
"npx", f"@openapitools/openapi-generator-cli@{self.config.generator_version}",
"generate",
"-i", self.config.spec_path,
"-g", "python",
"-o", output_dir,
"--additional-properties", ",".join([
f"packageName={self.config.python_package_name}",
f"packageVersion={self.config.python_version}",
"generateSourceCodeOnly=false",
]),
]
result = subprocess.run(cmd, capture_output=True, text=True)
if result.returncode != 0:
logger.error(f"Python generation failed:\n{result.stderr}")
raise RuntimeError("Python SDK generation failed")
logger.info(f"Python SDK generated at {output_dir}")
return output_dir
def generate_api_docs(self) -> str:
"""Generate interactive API documentation."""
output_file = f"{self.config.output_base}/docs/index.html"
Path(output_file).parent.mkdir(parents=True, exist_ok=True)
cmd = [
"npx", "redoc-cli", "build",
self.config.spec_path,
"--output", output_file,
"--options.theme.colors.primary.main=#ff9900",
"--options.theme.typography.fontFamily=Noto Sans JP, sans-serif",
]
result = subprocess.run(cmd, capture_output=True, text=True)
if result.returncode != 0:
logger.error(f"Doc generation failed:\n{result.stderr}")
raise RuntimeError("API doc generation failed")
logger.info(f"API docs generated at {output_file}")
return output_file
def run_contract_tests(self) -> dict:
"""
Run contract tests using Schemathesis to fuzz the API.
Validates that the live API matches the OpenAPI spec.
"""
result = subprocess.run(
[
"schemathesis", "run",
self.config.spec_path,
"--base-url", "https://api-staging.manga-assist.example.com/v2",
"--checks", "all",
"--max-examples", "50",
"--stateful", "links",
"--report",
],
capture_output=True,
text=True,
timeout=300,
)
return {
"passed": result.returncode == 0,
"output": result.stdout,
"errors": result.stderr if result.returncode != 0 else "",
}
def run_full_pipeline() -> dict:
"""Execute the full SDK generation pipeline."""
gen = SDKGenerator()
results = {}
# Step 1: Validate
if not gen.validate_spec():
return {"success": False, "stage": "validation"}
# Step 2: Check for breaking changes
breaking = gen.check_breaking_changes("api/manga-assist-api.previous.yaml")
results["breaking_changes"] = breaking
if breaking["breaking"]:
logger.warning("Breaking changes detected — manual review required")
# Don't block — flag for review
# Step 3: Generate SDKs
results["typescript_dir"] = gen.generate_typescript_client()
results["python_dir"] = gen.generate_python_client()
# Step 4: Generate docs
results["docs_path"] = gen.generate_api_docs()
# Step 5: Contract tests
results["contract_tests"] = gen.run_contract_tests()
results["success"] = True
return results
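The breaking-change result can also drive the SDK version bump. A hedged sketch of that policy (the semantics — major bump on breaking change, minor otherwise — are an assumption about MangaAssist's release policy, not an oasdiff feature):

```python
# Hypothetical versioning helper: picks the next SDK version from the
# result of check_breaking_changes. Assumed policy: breaking change ->
# major bump, otherwise minor bump; patch resets either way.
def next_sdk_version(current: str, breaking: bool) -> str:
    """Compute the next semantic version for a generated SDK."""
    major, minor, _patch = (int(part) for part in current.split("."))
    if breaking:
        return f"{major + 1}.0.0"
    return f"{major}.{minor + 1}.0"
```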
Prompt Flow Definitions
Flow Definition Schema
Bedrock Prompt Flows are defined as JSON documents that describe nodes, edges, and variable bindings. MangaAssist defines flows programmatically for version control.
graph LR
subgraph FlowDef["Flow Definition"]
NODES[Nodes<br/>Prompt, Condition,<br/>Lambda, Iterator]
EDGES[Edges<br/>Node connections]
VARS[Variables<br/>Input/Output bindings]
end
subgraph Lifecycle["Flow Lifecycle"]
DEF[Define in Code]
CREATE[Create via API]
VERSION[Version + Tag]
ALIAS[Create Alias]
DEPLOY[Deploy Canary]
MONITOR[Monitor Metrics]
PROMOTE[Promote or Rollback]
end
DEF --> CREATE --> VERSION --> ALIAS --> DEPLOY --> MONITOR --> PROMOTE
PROMOTE -->|Rollback| ALIAS
style DEPLOY fill:#ffc107,color:#000
style PROMOTE fill:#28a745,color:#fff
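Before the "Create via API" step, a quick structural check catches dangling connections. This validator is a sketch over the nodes/connections shape used in this document, not a Bedrock API feature:

```python
# Hypothetical pre-flight check for a flow definition dict: every
# connection must reference a node that is actually defined.
def validate_flow_definition(flow: dict) -> list[str]:
    """Return a list of problems; empty list means structurally sound."""
    node_names = {node["name"] for node in flow.get("nodes", [])}
    problems = []
    for conn in flow.get("connections", []):
        for key in ("source", "target"):
            if conn[key] not in node_names:
                problems.append(f"connection references unknown node: {conn[key]}")
    return problems

# A well-formed two-node flow produces no problems
demo = {
    "nodes": [{"name": "FlowInput"}, {"name": "FlowOutput"}],
    "connections": [{"source": "FlowInput", "target": "FlowOutput"}],
}
```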
Prompt Flow Builder
"""
MangaAssist Prompt Flow Definition Builder
Programmatic construction of Bedrock Prompt Flows for version control.
"""
import json
import logging
from dataclasses import dataclass, field
from typing import Optional, Any
from enum import Enum
import boto3
from botocore.config import Config
logger = logging.getLogger(__name__)
class NodeType(Enum):
INPUT = "Input"
OUTPUT = "Output"
PROMPT = "Prompt"
CONDITION = "Condition"
LAMBDA = "LambdaFunction"
ITERATOR = "Iterator"
COLLECTOR = "Collector"
@dataclass
class PromptNodeConfig:
"""Configuration for a Prompt node."""
model_id: str
template: str
inference_config: dict = field(default_factory=lambda: {
"text": {
"temperature": 0.3,
"topP": 0.9,
"maxTokens": 1024,
}
})
@dataclass
class ConditionConfig:
"""Configuration for a Condition node."""
conditions: list[dict] = field(default_factory=list)
@dataclass
class FlowNode:
"""A node in a Prompt Flow."""
name: str
node_type: NodeType
config: Any = None
inputs: list[dict] = field(default_factory=list)
outputs: list[dict] = field(default_factory=list)
class PromptFlowBuilder:
"""
Builder for constructing MangaAssist Prompt Flow definitions.
Produces JSON that can be submitted to the Bedrock Agent API.
"""
def __init__(self, flow_name: str, description: str = ""):
self.flow_name = flow_name
self.description = description
self.nodes: list[FlowNode] = []
self.connections: list[dict] = []
def add_input_node(self, name: str = "FlowInput") -> "PromptFlowBuilder":
"""Add the flow input node."""
self.nodes.append(FlowNode(
name=name,
node_type=NodeType.INPUT,
outputs=[{"name": "document", "type": "Object"}],
))
return self
def add_output_node(self, name: str = "FlowOutput") -> "PromptFlowBuilder":
"""Add the flow output node."""
self.nodes.append(FlowNode(
name=name,
node_type=NodeType.OUTPUT,
inputs=[{"name": "document", "type": "Object"}],
))
return self
def add_prompt_node(
self,
name: str,
model_id: str,
template: str,
temperature: float = 0.3,
max_tokens: int = 1024,
) -> "PromptFlowBuilder":
"""Add a Prompt node for LLM invocation."""
config = PromptNodeConfig(
model_id=model_id,
template=template,
inference_config={
"text": {
"temperature": temperature,
"topP": 0.9,
"maxTokens": max_tokens,
}
},
)
self.nodes.append(FlowNode(
name=name,
node_type=NodeType.PROMPT,
config=config,
inputs=[{"name": "prompt_input", "type": "String"}],
outputs=[{"name": "model_output", "type": "String"}],
))
return self
def add_condition_node(
self, name: str, conditions: list[dict]
) -> "PromptFlowBuilder":
"""
Add a Condition node for branching.
conditions: [
{"name": "is_recommendation", "expression": "$.intent == 'recommendation'"},
{"name": "is_faq", "expression": "$.intent == 'faq'"},
]
"""
self.nodes.append(FlowNode(
name=name,
node_type=NodeType.CONDITION,
config=ConditionConfig(conditions=conditions),
inputs=[{"name": "condition_input", "type": "Object"}],
outputs=[{"name": c["name"], "type": "Object"} for c in conditions]
+ [{"name": "default", "type": "Object"}],
))
return self
def add_lambda_node(
self, name: str, function_arn: str
) -> "PromptFlowBuilder":
"""Add a Lambda function node."""
self.nodes.append(FlowNode(
name=name,
node_type=NodeType.LAMBDA,
config={"lambdaArn": function_arn},
inputs=[{"name": "lambda_input", "type": "Object"}],
outputs=[{"name": "lambda_output", "type": "Object"}],
))
return self
def connect(
self,
source_node: str,
source_output: str,
target_node: str,
target_input: str,
) -> "PromptFlowBuilder":
"""Connect two nodes."""
self.connections.append({
"source": source_node,
"sourceOutput": source_output,
"target": target_node,
"targetInput": target_input,
})
return self
def build(self) -> dict:
"""Build the complete flow definition."""
nodes = []
for node in self.nodes:
node_def = {
"name": node.name,
"type": node.node_type.value,
"inputs": node.inputs,
"outputs": node.outputs,
}
if node.config:
if node.node_type == NodeType.PROMPT:
node_def["configuration"] = {
"prompt": {
"sourceConfiguration": {
"inline": {
"modelId": node.config.model_id,
"templateType": "TEXT",
"templateConfiguration": {
"text": {
"text": node.config.template,
}
},
"inferenceConfiguration": node.config.inference_config,
}
}
}
}
elif node.node_type == NodeType.CONDITION:
node_def["configuration"] = {
"condition": {
"conditions": node.config.conditions,
}
}
elif node.node_type == NodeType.LAMBDA:
node_def["configuration"] = {
"lambdaFunction": node.config,
}
nodes.append(node_def)
return {
"name": self.flow_name,
"description": self.description,
"nodes": nodes,
"connections": self.connections,
}
def build_genre_classifier_flow() -> dict:
"""Build the MangaAssist genre classifier Prompt Flow."""
    # Template (Japanese): "You are a manga genre classification expert.
    # Analyze the user's message and determine the intent and genre.
    # Respond in the JSON format below."
    template = """あなたはマンガジャンル分類の専門家です。
ユーザーのメッセージを分析して、意図とジャンルを判定してください。
ユーザーメッセージ: {{user_message}}
会話履歴: {{conversation_context}}
以下のJSON形式で回答してください:
{
"intent": "recommendation" | "faq" | "general",
"genre": "shonen" | "shoujo" | "seinen" | "josei" | "isekai" | "horror" | "sports" | "unknown",
"confidence": 0.0-1.0,
"keywords": ["detected", "keywords"]
}"""
builder = PromptFlowBuilder(
flow_name="MangaAssist-GenreClassifier",
description="Classifies user intent and manga genre preference",
)
flow = (
builder
.add_input_node()
.add_prompt_node(
name="ClassifyIntent",
model_id="anthropic.claude-3-haiku-20240307-v1:0",
template=template,
temperature=0.1,
max_tokens=200,
)
.add_condition_node(
name="RouteByIntent",
conditions=[
{
"name": "is_recommendation",
"expression": "$.intent == 'recommendation'",
},
{
"name": "is_faq",
"expression": "$.intent == 'faq'",
},
],
)
.add_output_node()
.connect("FlowInput", "document", "ClassifyIntent", "prompt_input")
.connect("ClassifyIntent", "model_output", "RouteByIntent", "condition_input")
.connect("RouteByIntent", "is_recommendation", "FlowOutput", "document")
.connect("RouteByIntent", "is_faq", "FlowOutput", "document")
.connect("RouteByIntent", "default", "FlowOutput", "document")
.build()
)
return flow
def build_recommendation_flow() -> dict:
"""Build the MangaAssist recommendation Prompt Flow."""
    # Template (Japanese): "You are MangaAssist's manga recommendation expert.
    # Rules: recommend 3-5 titles with reasons; match the user's tastes;
    # prefer in-stock titles; respond in polite Japanese."
    template = """あなたはMangaAssistのマンガ推薦エキスパートです。
ジャンル: {{genre}}
ユーザーの質問: {{user_message}}
参考情報:
{{rag_context}}
会話履歴:
{{conversation_context}}
ルール:
- 3-5冊のマンガを推薦してください
- 各推薦には理由を添えてください
- ユーザーの好みに合わせた推薦をしてください
- 在庫がある作品を優先してください
- 丁寧な日本語で回答してください"""
builder = PromptFlowBuilder(
flow_name="MangaAssist-Recommendation",
description="Generates personalized manga recommendations",
)
flow = (
builder
.add_input_node()
.add_prompt_node(
name="GenerateRecommendation",
model_id="anthropic.claude-3-sonnet-20240229-v1:0",
template=template,
temperature=0.3,
max_tokens=1024,
)
.add_prompt_node(
name="FormatForDisplay",
model_id="anthropic.claude-3-haiku-20240307-v1:0",
template=(
"以下の推薦文を読みやすく整形してください。"
"マークダウン形式で、絵文字は使わないでください。\n\n{{recommendation}}"
),
temperature=0.1,
max_tokens=1024,
)
.add_output_node()
.connect("FlowInput", "document", "GenerateRecommendation", "prompt_input")
.connect("GenerateRecommendation", "model_output", "FormatForDisplay", "prompt_input")
.connect("FormatForDisplay", "model_output", "FlowOutput", "document")
.build()
)
return flow
def deploy_flow(flow_definition: dict) -> dict:
"""Deploy a Prompt Flow to Bedrock."""
client = boto3.client(
"bedrock-agent",
config=Config(region_name="ap-northeast-1"),
)
# Create or update the flow
try:
response = client.create_flow(
name=flow_definition["name"],
description=flow_definition.get("description", ""),
definition={
"nodes": flow_definition["nodes"],
"connections": flow_definition["connections"],
},
executionRoleArn="arn:aws:iam::123456789012:role/MangaAssistPromptFlowRole",
)
flow_id = response["id"]
logger.info(f"Flow created: {flow_id}")
# Prepare and create a version
client.prepare_flow(flowIdentifier=flow_id)
version_response = client.create_flow_version(
flowIdentifier=flow_id,
description=f"Automated deployment at {__import__('time').strftime('%Y-%m-%d %H:%M')}",
)
version = version_response["version"]
logger.info(f"Flow version created: v{version}")
return {
"flow_id": flow_id,
"version": version,
"status": "deployed",
}
except Exception as e:
logger.error(f"Flow deployment failed: {e}")
raise
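The lifecycle diagram ends with a canary deploy. A Bedrock flow alias maps to a single version, so the 10% split happens in the orchestrator; this sketch assumes client-side routing between two alias IDs (the alias names and default weight are illustrative):

```python
import hashlib

# Hypothetical client-side canary split between two flow aliases.
# Hashing the session ID keeps each session pinned to one variant,
# so a conversation never flips between prompt versions mid-stream.
def pick_flow_alias(
    session_id: str,
    stable_alias: str,
    canary_alias: str,
    canary_percent: int = 10,
) -> str:
    """Deterministically bucket a session into stable or canary."""
    digest = hashlib.sha256(session_id.encode("utf-8")).digest()
    bucket = digest[0] * 100 // 256  # stable 0-99 bucket per session
    return canary_alias if bucket < canary_percent else stable_alias
```

Promotion then reduces to raising `canary_percent` to 100 (or repointing the stable alias), and rollback to setting it back to 0.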
Amplify Chat Widget Embed Pattern
graph TB
subgraph PartnerSite["Partner Website"]
IFRAME[iframe embed<br/>sandboxed]
SCRIPT[Script embed<br/>window.MangaAssist.init]
SDK[SDK Integration<br/>npm package]
end
subgraph Widget["Chat Widget"]
BUBBLE[Chat Bubble<br/>Fixed position]
PANEL[Chat Panel<br/>Slide-in drawer]
AUTH[Auth Handler<br/>Cognito token]
end
subgraph Backend["MangaAssist Backend"]
APIGW[API Gateway]
FARGATE[ECS Fargate]
end
IFRAME --> BUBBLE
SCRIPT --> BUBBLE
SDK --> BUBBLE
BUBBLE --> PANEL
PANEL --> AUTH
AUTH --> APIGW
APIGW --> FARGATE
style BUBBLE fill:#61dafb,color:#000
style PANEL fill:#61dafb,color:#000
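For the iframe option above, a snippet generator keeps partner embeds consistent. Everything here is illustrative — the widget URL, the `partner` query parameter, and the sandbox policy are assumptions, not a published MangaAssist API:

```python
# Hypothetical embed-snippet generator for the sandboxed iframe option.
# allow-scripts is required for the widget to run; allow-same-origin lets
# the (cross-origin) widget keep its Cognito session cookies.
def iframe_embed_snippet(
    partner_id: str,
    widget_url: str = "https://widget.manga-assist.example.com/embed",
) -> str:
    """Build the iframe tag a partner pastes into their page."""
    return (
        f'<iframe src="{widget_url}?partner={partner_id}" '
        'title="MangaAssist chat" width="380" height="560" '
        'sandbox="allow-scripts allow-same-origin" loading="lazy"></iframe>'
    )
```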
Key Takeaways
| # | Takeaway | MangaAssist Application |
|---|---|---|
| 1 | Amplify build specs drive reproducible deployments — The amplify.yml file defines the exact build, test, and deploy pipeline with per-branch environment configuration. | Production, staging, and PR preview environments use the same build spec with different environment variables for API endpoints and Cognito pools. |
| 2 | OpenAPI code generation eliminates SDK drift — Auto-generated TypeScript and Python SDKs stay synchronized with the API spec, catching incompatibilities at build time. | Partner integrations receive a new SDK version within minutes of an API change; breaking changes are flagged by oasdiff before merge. |
| 3 | Contract testing validates live API compliance — Schemathesis fuzzes the staging API against the OpenAPI spec, catching violations that unit tests miss. | 50 auto-generated test cases per endpoint catch edge cases like malformed Japanese characters in query parameters. |
| 4 | Prompt Flows as code enables version control — Defining flows programmatically (not just in the visual builder) allows Git-tracked changes, code review, and automated deployment. | Genre classifier prompt changes go through pull request review before the flow is deployed to production. |
| 5 | Security headers are non-negotiable — CSP, HSTS, X-Frame-Options, and Permissions-Policy headers protect against XSS, clickjacking, and data exfiltration. | The Amplify custom headers config prevents the app itself from being framed (X-Frame-Options: DENY) and restricts connections to MangaAssist-owned domains plus the Cognito endpoint. |
| 6 | Widget embeds need multiple integration options — iframe (safest), script tag (flexible), and npm package (deepest) cover partner integration requirements from simple to sophisticated. | Small manga blog partners use the iframe embed; major retail sites use the npm package for deep UI integration. |
| 7 | Canary deployments for Prompt Flows reduce risk — 10% traffic to a new prompt version catches quality regressions before full rollout. | A recommendation prompt that inadvertently favors a single publisher was caught during canary and rolled back in under 5 minutes. |