API - ARISEConfig

All configuration for an ARISE instance. Pass it via ARISE(config=...); any field you omit falls back to its default.

from arise import ARISE, ARISEConfig

config = ARISEConfig(
    model="gpt-4o-mini",
    sandbox_backend="subprocess",
    failure_threshold=5,
    max_evolutions_per_hour=3,
    verbose=True,
)
arise = ARISE(agent_fn=my_agent, reward_fn=reward_fn, config=config)
| Field | Type | Default | Description |
|---|---|---|---|
| model | str | "gpt-4o-mini" | LLM model for tool synthesis (not your agent’s model) |
| verbose | bool | True | Print episode status and evolution progress |
| Field | Type | Default | Description |
|---|---|---|---|
| sandbox_backend | str | "subprocess" | "subprocess" or "docker" |
| sandbox_timeout | int | 30 | Seconds before the sandbox kills the process |
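The subprocess backend is the low-friction default; the docker backend trades startup cost for stronger isolation (assuming, as the name suggests, it runs synthesized tools in containers). A sketch of a tightened sandbox setup — the values here are illustrative, not recommendations:

```python
from arise import ARISEConfig

config = ARISEConfig(
    sandbox_backend="docker",  # "subprocess" (default) or "docker"
    sandbox_timeout=60,        # kill a tool process after 60s instead of the default 30
)
```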
| Field | Type | Default | Description |
|---|---|---|---|
| failure_threshold | int | 5 | Consecutive failures before evolution triggers |
| plateau_window | int | 10 | Episodes to look back for plateau detection |
| plateau_min_improvement | float | 0.05 | Minimum success-rate improvement to avoid the plateau trigger |
| max_evolutions_per_hour | int | 3 | Rate limit for evolution cycles (cost control) |
| max_refinement_attempts | int | 3 | Max LLM retries to fix a failing skill |
| max_synthesis_workers | int | 3 | Max concurrent tool synthesis threads |
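These fields compose: evolution fires on consecutive failures (failure_threshold) or on a plateau (success rate over the last plateau_window episodes improving by less than plateau_min_improvement), and max_evolutions_per_hour caps how often either trigger can fire. An illustrative tuning for experimentation — the specific numbers are assumptions, not recommendations:

```python
from arise import ARISEConfig

config = ARISEConfig(
    failure_threshold=3,           # evolve after 3 consecutive failures
    plateau_window=20,             # look back over the last 20 episodes...
    plateau_min_improvement=0.10,  # ...and require a 10-point success-rate gain
    max_evolutions_per_hour=6,     # but never more than 6 evolution cycles per hour
    max_refinement_attempts=2,     # give the LLM 2 tries to fix a failing skill
)
```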
| Field | Type | Default | Description |
|---|---|---|---|
| max_library_size | int | 50 | Max number of active skills before synthesis stops |
| skill_store_path | str | "./arise_skills" | Local SQLite skill library path |
| trajectory_store_path | str | "./arise_trajectories" | Local SQLite trajectory store path |
| max_trajectories | int | 1000 | Max trajectories to retain (older ones are pruned) |
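Both stores are local SQLite files, so pointing them at a persistent volume keeps skills and trajectories across restarts. A hypothetical layout (the paths are illustrative):

```python
from arise import ARISEConfig

config = ARISEConfig(
    skill_store_path="/var/lib/arise/skills",             # persistent skill library
    trajectory_store_path="/var/lib/arise/trajectories",  # persistent trajectory log
    max_trajectories=5000,  # keep more history before pruning (default 1000)
    max_library_size=100,   # allow a larger active skill set (default 50)
)
```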
| Field | Type | Default | Description |
|---|---|---|---|
| allowed_imports | list[str] \| None | None | Whitelist of importable modules. None = no restriction. Always set this in production. |
| Field | Type | Default | Description |
|---|---|---|---|
| s3_bucket | str \| None | None | S3 bucket for the distributed skill store |
| s3_prefix | str | "arise" | S3 key prefix |
| sqs_queue_url | str \| None | None | SQS queue URL for trajectory reporting |
| aws_region | str | "us-east-1" | AWS region |
| skill_cache_ttl_seconds | int | 30 | How often to refresh skills from S3 |
| Field | Type | Default | Description |
|---|---|---|---|
| registry_bucket | str \| None | None | S3 bucket for the skill registry |
| registry_prefix | str | "arise-registry" | S3 key prefix for the registry |
| registry_check_before_synthesis | bool | True | Check the registry before calling the LLM to synthesize |
| Field | Type | Default | Description |
|---|---|---|---|
| model_routes | dict[str, str] \| None | None | Route specific synthesis tasks to different models |
| auto_select_model | bool | False | Auto-promote the model with the best synthesis track record |
config = ARISEConfig(
    model_routes={
        "gap_detection": "gpt-4o-mini",             # cheap for analysis
        "synthesis": "claude-sonnet-4-5-20250929",  # better code quality
        "refinement": "gpt-4o-mini",
    },
    auto_select_model=True,
)
| Field | Type | Default | Description |
|---|---|---|---|
| enable_telemetry | bool | False | Enable OpenTelemetry spans for evolution steps (requires arise-ai[otel]) |
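Telemetry is off by default and needs the optional OpenTelemetry extra installed. A minimal sketch:

```python
# pip install "arise-ai[otel]"  — required before enabling telemetry
from arise import ARISEConfig

config = ARISEConfig(
    enable_telemetry=True,  # emit OpenTelemetry spans for each evolution step
)
```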

Development (minimal config):

config = ARISEConfig(
    failure_threshold=2,  # evolve quickly
    verbose=True,
)

Production (locked down):

config = ARISEConfig(
    model="gpt-4o-mini",
    sandbox_backend="docker",
    sandbox_timeout=30,
    failure_threshold=5,
    max_evolutions_per_hour=3,
    max_library_size=50,
    allowed_imports=["json", "re", "hashlib", "csv", "math", "base64", "datetime"],
    verbose=False,
)

Distributed with registry:

config = ARISEConfig(
    s3_bucket="arise-skills-prod",
    sqs_queue_url="https://sqs.us-west-2.amazonaws.com/.../arise-trajectories",
    aws_region="us-west-2",
    registry_bucket="arise-registry-prod",
    registry_check_before_synthesis=True,
    model="gpt-4o-mini",
    allowed_imports=["json", "re", "hashlib"],
)