# PocketFlow PHP

Minimalist LLM framework for PHP. Let Agents build Agents!

Language: PHP | License: MIT
PocketFlow PHP is the first PHP implementation of the minimalist LLM framework concept.

- **Lightweight**: ~400 lines of PHP. Zero bloat, pure PHP elegance.
- **Framework Agnostic**: Works with any PHP project, not tied to specific frameworks.
- **Graph-Based**: Simple node and flow abstraction for complex LLM workflows.
- **ReactPHP Ready**: Optional async support for parallel processing.
Get started with PocketFlow PHP:
- **Installation**: `composer require projectsaturnstudios/pocketflow-php`
- **Quick Start**: Copy the source files into your PHP project
- **Documentation**: Examples in this README and source code
- **LLM Integration**: Bring your own LLM client (OpenAI SDK, Guzzle, etc.)
The PHP ecosystem was missing a minimalist LLM workflow framework... until now!
| Framework | Abstraction | PHP Integration | LLM Support | Lines | Dependencies |
|---|---|---|---|---|---|
| LLPhant | Comprehensive | Framework agnostic (Symfony/Laravel compatible) | Multiple providers (OpenAI, Anthropic, Mistral, etc.) | ~15K+ | Heavy (many providers) |
| LangChain PHP | Agent, Chain | Basic (work in progress) | Limited (OpenAI, llama.cpp) | ~5K | Moderate |
| **PocketFlow PHP** | Graph | Framework agnostic (pure PHP, works anywhere) | Bring your own (any HTTP client) | ~400 | Minimal |
The core abstraction: graph-based workflow execution with simple nodes and flows.

- **BaseNode**: Foundation class with the `prep()`, `exec()`, `post()` lifecycle
- **Node**: Extends `BaseNode` with retry logic and fallback handling
- **Flow**: Orchestrates node execution with action-based routing
- **BatchNode/BatchFlow**: Process arrays of data through workflows
- **AsyncNode/AsyncFlow**: ReactPHP-powered parallel execution (optional)
- **Reference Passing**: Proper `&$shared` parameter handling for state persistence
- **Type Safety**: Full PHP 8.1+ type declarations
- **Error Handling**: Comprehensive exception handling with fallbacks
- **Memory Management**: Configurable data retention
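The lifecycle and routing above can be sketched in plain PHP. This is a simplified illustration, not the library source: `SketchNode` and `SketchFlow` are names invented here to avoid clashing with the real classes, and the real `Node`/`Flow` add retries, fallbacks, and more.

```php
<?php
// Simplified sketch of PocketFlow's prep()/exec()/post() lifecycle and
// action-based routing. Illustrative only, not the library source.
abstract class SketchNode
{
    /** @var array<string, SketchNode> map of action string => successor */
    private array $successors = [];

    public function next(SketchNode $node, string $action = 'default'): SketchNode
    {
        $this->successors[$action] = $node;
        return $node;
    }

    public function successor(?string $action): ?SketchNode
    {
        return $this->successors[$action] ?? null;
    }

    public function prep(mixed &$shared): mixed { return $shared; }

    abstract public function exec(mixed $prep_res): mixed;

    public function post(mixed &$shared, mixed $prep_res, mixed $exec_res): mixed
    {
        return 'default';
    }
}

class SketchFlow
{
    public function __construct(private SketchNode $start) {}

    public function _run(array &$shared): mixed
    {
        $node = $this->start;
        $action = null;
        while ($node !== null) {
            $prep_res = $node->prep($shared);                       // 1. gather input
            $exec_res = $node->exec($prep_res);                     // 2. do the work
            $action   = $node->post($shared, $prep_res, $exec_res); // 3. pick an edge
            $node     = $node->successor($action);                  // 4. follow it (or stop)
        }
        return $action; // last action string returned by a post()
    }
}
```

The key design point this shows: a flow never inspects node internals; it only follows whatever action string `post()` returns, which is why self-loops and branching fall out for free.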
```php
<?php
require 'vendor/autoload.php';

use ProjectSaturnStudios\PocketFlowPHP\{Node, Flow};

class HelloNode extends Node
{
    public function prep(mixed &$shared): mixed
    {
        return $shared;
    }

    public function exec(mixed $prep_res): mixed
    {
        return "Hello, " . ($prep_res['name'] ?? 'World') . "!";
    }

    public function post(mixed &$shared, mixed $prep_res, mixed $exec_res): mixed
    {
        $shared['greeting'] = $exec_res; // Store the greeting for the next node
        return 'success';                // Route along the 'success' edge
    }
}

class OutputNode extends Node
{
    public function prep(mixed &$shared): mixed
    {
        return $shared; // Pass through shared data
    }

    public function exec(mixed $prep_res): mixed
    {
        echo $prep_res['greeting'] ?? 'No greeting found';
        return 'done';
    }

    public function post(mixed &$shared, mixed $prep_res, mixed $exec_res): mixed
    {
        return $exec_res;
    }
}

// Create nodes
$helloNode = new HelloNode();
$outputNode = new OutputNode();

// Chain them
$helloNode->next($outputNode, 'success');

// Create flow and run
$flow = new Flow($helloNode);
$shared = ['name' => 'PocketFlow'];
$result = $flow->_run($shared);
```
```php
<?php
// Bring your own LLM client
use OpenAI\Client as OpenAIClient;

class LLMNode extends Node
{
    public function __construct(private OpenAIClient $client) {}

    public function prep(mixed &$shared): mixed
    {
        return ['prompt' => $shared['prompt'] ?? 'Say hello!'];
    }

    public function exec(mixed $prep_res): mixed
    {
        $response = $this->client->chat()->create([
            'model' => 'gpt-3.5-turbo',
            'messages' => [
                ['role' => 'user', 'content' => $prep_res['prompt']]
            ]
        ]);

        return $response->choices[0]->message->content;
    }

    public function post(mixed &$shared, mixed $prep_res, mixed $exec_res): mixed
    {
        $shared['llm_response'] = $exec_res;
        $shared['greeting'] = $exec_res; // OutputNode above echoes the 'greeting' key
        return 'success';
    }
}

// Usage
$client = OpenAI::client('your-api-key');
$llmNode = new LLMNode($client);
$outputNode = new OutputNode();

$llmNode->next($outputNode, 'success');

$flow = new Flow($llmNode);
$shared = ['prompt' => 'Write a haiku about PHP'];
$flow->_run($shared);
```
```php
<?php
class ChatNode extends Node
{
    public function __construct(private $llmClient) {}

    public function prep(mixed &$shared): mixed
    {
        // Get user input
        echo "You: ";
        $input = trim(fgets(STDIN));

        if ($input === 'exit') {
            return ['action' => 'exit'];
        }

        $shared['messages'][] = ['role' => 'user', 'content' => $input];
        return ['messages' => $shared['messages']];
    }

    public function exec(mixed $prep_res): mixed
    {
        if (($prep_res['action'] ?? null) === 'exit') {
            return 'exit';
        }

        // Call your LLM here
        return $this->llmClient->chat($prep_res['messages']);
    }

    public function post(mixed &$shared, mixed $prep_res, mixed $exec_res): mixed
    {
        if ($exec_res === 'exit') {
            echo "Goodbye!\n";
            return 'exit';
        }

        echo "AI: $exec_res\n\n";
        $shared['messages'][] = ['role' => 'assistant', 'content' => $exec_res];
        return 'continue'; // Self-loop
    }
}

// Create self-looping chat
$chatNode = new ChatNode($yourLLMClient);
$chatNode->next($chatNode, 'continue'); // Self-loop!

$flow = new Flow($chatNode);
$shared = ['messages' => []];
$flow->_run($shared);
```
```php
// Batch processing
$batchNode = new BatchNode();
$batchNode->setItems(['item1', 'item2', 'item3']);
$batchFlow = new BatchFlow($batchNode);
```

```php
// Async support (composer require react/socket)
use React\EventLoop\Loop;

// Parallel execution with promises
$asyncNode = new AsyncNode();
$asyncFlow = new AsyncFlow($asyncNode);
```

```php
// Action-based routing to different successors
$nodeA->next($nodeB, 'success');
$nodeA->next($nodeC, 'error');
$nodeA->next($nodeD, 'retry');
```
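With edges like these, `post()` acts as a router: whatever string it returns selects the successor registered under that action. Below is a hedged sketch of such a routing decision; the classification rules (`Throwable` means error, `null` means retry) are invented here for illustration, not prescribed by the framework.

```php
<?php
// Illustrative routing logic for a node wired with the three edges above.
// Returning a different action string sends the flow down a different edge.
function routeResult(mixed $exec_res): string
{
    if ($exec_res instanceof Throwable) {
        return 'error';   // follows next($nodeC, 'error')
    }
    if ($exec_res === null) {
        return 'retry';   // follows next($nodeD, 'retry')
    }
    return 'success';     // follows next($nodeB, 'success')
}
```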
| Feature | Python PocketFlow | PHP PocketFlow | Notes |
|---|---|---|---|
| Core Abstraction | ✅ Graph | ✅ Graph | Same philosophy |
| Async Support | ✅ asyncio | ✅ ReactPHP | Different implementations |
| Framework Integration | ❌ None | ✅ Framework Agnostic | Works with any PHP project |
| LLM Providers | ❌ Manual | ❌ Bring Your Own | Both require manual integration |
| Type Safety | ✅ Full | ✅ PHP 8.1+ strict types | |
| Lines of Code | 100 | ~400 | More features, still minimal |
- PHP 8.1+
- Composer
```shell
composer require projectsaturnstudios/pocketflow-php

# For async support
composer require react/socket

# For LLM integration (examples)
composer require openai-php/client
composer require guzzlehttp/guzzle
```
- **Install Package**: `composer require projectsaturnstudios/pocketflow-php`
- **Create Nodes**: Extend the `Node` or `BaseNode` classes
- **Chain Workflows**: Use `$node->next($nextNode, 'action')`
- **Run Flows**: `$flow = new Flow($startNode); $flow->_run($shared);`
Important: PocketFlow PHP is framework-agnostic and does not include LLM provider integrations. You need to:

- **Choose Your LLM Client**: OpenAI SDK, Guzzle HTTP, cURL, etc.
- **Implement in Nodes**: Add LLM calls in your `exec()` methods
- **Handle Responses**: Process LLM responses in your `post()` methods
- **Manage State**: Use the `&$shared` parameter for conversation history
This approach gives you complete control over your LLM integrations without vendor lock-in.
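For example, a bare-bones integration using nothing but ext-curl might look like the sketch below. The endpoint and model name are OpenAI chat-completions defaults; `buildChatPayload()` and `callChatApi()` are helper names invented here, not part of PocketFlow PHP.

```php
<?php
// Build the JSON body for an OpenAI-style chat completion request.
// The default model name is an assumption; substitute whatever you use.
function buildChatPayload(array $messages, string $model = 'gpt-4o-mini'): string
{
    return json_encode([
        'model'    => $model,
        'messages' => $messages,
    ], JSON_THROW_ON_ERROR);
}

// Minimal "bring your own client" call; you would typically invoke this
// from a node's exec() method and stash the result in post().
function callChatApi(string $apiKey, array $messages): string
{
    $ch = curl_init('https://api.openai.com/v1/chat/completions');
    curl_setopt_array($ch, [
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_POST           => true,
        CURLOPT_HTTPHEADER     => [
            'Content-Type: application/json',
            'Authorization: Bearer ' . $apiKey,
        ],
        CURLOPT_POSTFIELDS     => buildChatPayload($messages),
    ]);
    $raw = curl_exec($ch);
    curl_close($ch);

    $data = json_decode((string) $raw, true);
    return $data['choices'][0]['message']['content'] ?? '';
}
```

Swapping in Guzzle or the OpenAI SDK only changes the transport; the node lifecycle around it stays the same.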
Dependencies:
- ReactPHP: Required only for async features (optional)
- PHP 8.1+: Required for type safety and modern features
No Lock-ins:
- ❌ No specific LLM provider
- ❌ No specific HTTP client
- ❌ No specific framework
- ❌ No specific database
This is the world's first PHP implementation of PocketFlow! We welcome contributions:
- 🐛 Bug Reports: Found an issue? Let us know!
- 🚀 Feature Requests: Ideas for PHP-specific features?
- 📖 Documentation: Help improve our docs
- 🧪 Examples: Share your PocketFlow PHP workflows
- Core Framework: Basic node and flow implementation
- Async Support: ReactPHP integration
- Batch Processing: Array and parallel processing
- More Examples: Real-world workflow patterns
- Performance: Optimize for large-scale applications
- Testing: Comprehensive test suite
- Documentation: Full API documentation
MIT License - same as original PocketFlow
- Original PocketFlow: The-Pocket/PocketFlow - The inspiration and foundation
- ReactPHP: For async capabilities in PHP (optional dependency)
- PHP Community: For the amazing language ecosystem
Built with ADHD by Project Saturn Studios