Given a log backend, set a new log level.
backend required | string Enum: "console" "emit" "logstash" Defines the log backend; possible values depend on the current configuration |
level required | string Enum: "debug" "info" "notice" "warning" "error" "critical" "alert" The log level. |
{- "success": true,
- "message": "string",
- "error": "string"
}
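As an illustration, a minimal Python sketch of calling this endpoint with the requests library; the base URL and the /log/level path are assumptions and must be adapted to the actual route of your faxe instance.

import requests

BASE = "http://localhost:8081"   # hypothetical faxe API address

# Hypothetical route; backend and level must be values from the enums above.
resp = requests.post(f"{BASE}/log/level",
                     data={"backend": "console", "level": "info"})
body = resp.json()
if not body.get("success"):
    print("failed:", body.get("error") or body.get("message"))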
Returns hand-picked config settings currently active.
{- "metrics": {
- "port": 1883,
- "host": "127.0.0.1",
- "enable": "false",
- "base_topic": "ttgw/sys/faxe/10_10_2_15/metrics"
}, - "debug_time_ms": 25000,
- "debug": {
- "port": 1883,
- "host": "127.0.0.1",
- "enable": "false",
- "base_topic": "ttgw/sys/faxe/10_10_2_15/debug"
}, - "conn_status": {
- "port": 1883,
- "host": "127.0.0.1",
- "enable": "true",
- "base_topic": "ttgw/sys/faxe/10_10_2_15/conn_status"
}
}
Returns a JSON list containing information about each built-in node, usable by code editors.
[
  {
    "name": "mem",
    "description": "Flow wide value storage. mem values are available to any other node (in lambda expressions) within a flow.",
    "parameters": [
      {
        "name": "field",
        "dataType": "string",
        "description": "field-path"
      },
      {
        "name": "key",
        "dataType": "string",
        "description": "name of the value storage"
      },
      {
        "name": "type",
        "dataType": "string",
        "description": "Type of mem storage, one of 'single', 'list' or 'set'",
        "defaultValue": "single"
      },
      {
        "name": "default",
        "dataType": "string | number",
        "description": "Prefill the storage with this value"
      },
      {
        "name": "default_json",
        "dataType": "is_set",
        "description": "When set, the default value will be interpreted as a json string",
        "defaultValue": false
      }
    ]
  },
  {
    "name": "tcp_send",
    "description": "This node connects to a tcp endpoint and sends data with a defined packet size.",
    "parameters": [
      {
        "name": "ip",
        "dataType": "string",
        "description": "ip or hostname for the tcp peer"
      },
      {
        "name": "port",
        "dataType": "integer",
        "description": "port number"
      },
      {
        "name": "packet",
        "dataType": "integer",
        "description": "packet length",
        "defaultValue": 2
      },
      {
        "name": "every",
        "dataType": "string",
        "description": "send interval"
      },
      {
        "name": "msg_text",
        "dataType": "string",
        "description": "predefined string to send to the peer endpoint"
      },
      {
        "name": "msg_json",
        "dataType": "string",
        "description": "predefined json-string to send to the peer endpoint"
      },
      {
        "name": "response_as",
        "dataType": "string",
        "description": "name of the field for parsed data"
      },
      {
        "name": "response_json",
        "dataType": "is_set",
        "description": "interprets a response as a json-string",
        "defaultValue": false
      },
      {
        "name": "response_timeout",
        "dataType": "duration",
        "description": "timeout for a \"response\" after a message has been sent",
        "defaultValue": "5s"
      }
    ]
  }
]
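As a sketch of how a code editor might consume this metadata (the /nodes path and the base URL are assumptions), the list can be flattened into simple completion entries:

import requests

BASE = "http://localhost:8081"                  # hypothetical faxe API address
nodes = requests.get(f"{BASE}/nodes").json()    # hypothetical route

# One completion entry per node: label, doc string and parameter hints.
completions = [
    {
        "label": node["name"],
        "detail": node["description"],
        "params": [f'{p["name"]}: {p["dataType"]}' for p in node.get("parameters", [])],
    }
    for node in nodes
]
print(completions[0]["label"], completions[0]["params"])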
Returns a JSON list containing information about all built-in lambda functions, intended for use in online editors.
[
  {
    "name": "bool",
    "arguments": "a_value",
    "return": "true|false",
    "description": ""
  },
  {
    "name": "int",
    "arguments": "value",
    "return": "integer",
    "description": ""
  },
  {
    "name": "float",
    "arguments": "value",
    "return": "float",
    "description": ""
  },
  {
    "name": "string",
    "arguments": "val",
    "return": "string",
    "description": ""
  },
  {
    "name": "now",
    "arguments": "",
    "return": "integer",
    "description": "returns an utc timestamp in milliseconds"
  },
  {
    "name": "dt_parse",
    "arguments": "ts, formatstring",
    "return": "integer",
    "description": "used to parse a datetime string to the internal format, see datetime-parsing for details"
  },
  {
    "name": "to_iso8601",
    "arguments": "ts",
    "return": "string",
    "description": "converts the timestamp to an ISO8601 datetime string"
  },
  {
    "name": "to_rfc3339",
    "arguments": "ts",
    "return": "string",
    "description": "converts the timestamp to an RFC3339 datetime string"
  }
]
Returns statistics about the Erlang VM faxe is running on.
{- "vmstats-vm_uptime": 3743605,
- "vmstats-run_queue": 1,
- "vmstats-reductions": 39523,
- "vmstats-proc_limit": 262144,
- "vmstats-proc_count": 182,
- "vmstats-port_limit": 65536,
- "vmstats-port_count": 9,
- "vmstats-modules": 1338,
- "vmstats-messages_in_queues": 0,
- "vmstats-memory.total": 86.81,
- "vmstats-memory.procs_used": 31.51,
- "vmstats-memory.ets": 2.52,
- "vmstats-memory.binary": 0.49,
- "vmstats-memory.atom_used": 1.43,
- "vmstats-io.bytes_out": 0,
- "vmstats-io.bytes_in": 0,
- "vmstats-gc.words_reclaimed": 261675,
- "vmstats-gc.count": 4,
- "vmstats-error_logger_queue_len": 0,
- "vmstats-atom_count": 44524
}
Returns top flow node processes sorted by reduction count.
[
  {
    "node": "debug_test1-eval2",
    "reductions": 23280,
    "message_queue_len": 0,
    "initial_call": ["proc_lib", "init_p", 5]
  },
  {
    "node": "debug_test-eval2",
    "reductions": 21390,
    "message_queue_len": 0,
    "initial_call": ["proc_lib", "init_p", 5]
  },
  {
    "node": "debug_test1-json_emitter1",
    "reductions": 2445,
    "message_queue_len": 0,
    "initial_call": ["proc_lib", "init_p", 5]
  },
  {
    "node": "debug_test-json_emitter1",
    "reductions": 2320,
    "message_queue_len": 0,
    "initial_call": ["proc_lib", "init_p", 5]
  }
]
{- "running_temp_tasks": 0,
- "running_tasks": 0,
- "registered_templates": 4,
- "registered_tasks": 18,
- "permanent_tasks": 0,
- "faxe_version": "0.8.1",
- "data_throughput_sec": 11,
- "data_paths_known": 0
}
Returns top Erlang processes sorted by reduction count.
[
  {
    "registered_name": "code_server",
    "reductions": 445966,
    "message_queue_len": 0,
    "initial_call": ["erlang", "apply", 2]
  },
  {
    "reductions": 431797,
    "message_queue_len": 0,
    "initial_call": ["proc_lib", "init_p", 5]
  },
  {
    "registered_name": "faxe_stats",
    "reductions": 208607,
    "message_queue_len": 0,
    "initial_call": ["proc_lib", "init_p", 5]
  },
  {
    "reductions": 131331,
    "message_queue_len": 0,
    "initial_call": ["proc_lib", "init_p", 5]
  },
  {
    "reductions": 127561,
    "message_queue_len": 0,
    "initial_call": ["proc_lib", "init_p", 5]
  },
  {
    "registered_name": "erl_prim_loader",
    "reductions": 107626,
    "message_queue_len": 0,
    "initial_call": ["erlang", "apply", 2]
  },
  {
    "registered_name": "application_controller",
    "reductions": 72385,
    "message_queue_len": 0,
    "initial_call": ["erlang", "apply", 2]
  }
]
Returns top Erlang processes sorted by message queue length.
[
  {
    "reductions": 17776,
    "message_queue_len": 122,
    "initial_call": ["proc_lib", "init_p", 5]
  },
  {
    "reductions": 1387,
    "message_queue_len": 12,
    "initial_call": ["proc_lib", "init_p", 5]
  },
  {
    "reductions": 767,
    "message_queue_len": 3,
    "initial_call": ["proc_lib", "init_p", 5]
  },
  {
    "reductions": 1172,
    "message_queue_len": 0,
    "initial_call": ["proc_lib", "init_p", 5]
  },
  {
    "reductions": 19544,
    "message_queue_len": 0,
    "initial_call": ["proc_lib", "init_p", 5]
  }
]
Returns a (possibly filtered) list of all tasks.
orderby | string order the list of tasks by 'id', 'name', 'last_start', 'last_stop' or (default) 'changed' |
dir | string direction for ordering, 'asc' or (default) 'desc' |
full | string Default: false whether to return every task with the additional fields: "dfs", "group", "group_leader" |
[
  {
    "id": 0,
    "name": "string",
    "dfs": "string",
    "tags": ["tag1", "tag2", "tag3"],
    "vars": {
      "template_vars": {
        "var1": "val1",
        "var2": "val2"
      }
    },
    "template": "string",
    "running": true,
    "permanent": true,
    "changed": "string",
    "last_start": "string",
    "last_stop": "string",
    "group_leader": true,
    "group": "string"
  }
]
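For illustration, a hedged sketch of querying this listing with the ordering parameters; the '/tasks' path is mentioned elsewhere in this document, while the base URL is an assumption.

import requests

BASE = "http://localhost:8081"   # hypothetical faxe API address

# Newest-changed tasks first, including the optional "dfs"/"group"/"group_leader" fields.
tasks = requests.get(f"{BASE}/tasks",
                     params={"orderby": "changed", "dir": "desc", "full": "true"}).json()
for task in tasks:
    print(task["id"], task["name"], "running" if task["running"] else "stopped")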
none
orderby | string order the list of tasks by 'id', 'name', 'last_start', 'last_stop' or (default) 'changed' |
dir | string direction for ordering, 'asc' or (default) 'desc' |
full | string Default: false whether to return every task with the additional fields: "dfs", "group", "group_leader" |
[
  {
    "id": 0,
    "name": "string",
    "dfs": "string",
    "tags": ["tag1", "tag2", "tag3"],
    "vars": {
      "template_vars": {
        "var1": "val1",
        "var2": "val2"
      }
    },
    "template": "string",
    "running": true,
    "permanent": true,
    "changed": "string",
    "last_start": "string",
    "last_stop": "string",
    "group_leader": true,
    "group": "string"
  }
]
Get a list of all tasks that were created using the template with the given id.
template_id required | string Id of the template |
orderby | string Default: "changed" order the list of tasks by 'id', 'name', 'last_start', 'last_stop' or (default) 'changed' |
dir | string Default: "desc" direction for ordering, 'asc' or (default) 'desc' |
full | string Default: false whether to return every task with the additional fields: "dfs", "group", "group_leader" |
[
  {
    "id": 0,
    "name": "string",
    "dfs": "string",
    "tags": ["tag1", "tag2", "tag3"],
    "vars": {
      "template_vars": {
        "var1": "val1",
        "var2": "val2"
      }
    },
    "template": "string",
    "running": true,
    "permanent": true,
    "changed": "string",
    "last_start": "string",
    "last_stop": "string",
    "group_leader": true,
    "group": "string"
  }
]
Stop a list of running tasks by their ids. Give a comma-separated list of task-ids.
ids required | string comma separated list of task-ids |
{- "success": true,
- "message": "stopped 1 flow"
}
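A minimal sketch of building the comma-separated ids parameter; the /tasks/stop route is an assumption.

import requests

BASE = "http://localhost:8081"   # hypothetical faxe API address
ids = [3, 7, 12]                 # tasks to stop

resp = requests.post(f"{BASE}/tasks/stop",   # hypothetical route
                     data={"ids": ",".join(str(i) for i in ids)})
print(resp.json()["message"])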
Update all tasks created from the specified template, running or not, so that changes to templates and/or macros get applied to these tasks at once.
template_id required | string The template_id |
{- "success": true,
- "message": "updated 7 flows"
}
Import a list of JSON-encoded task definitions. Such a list can be retrieved with the '/tasks' method, for example.
tasks required | string A JSON-encoded list of tasks. |
{- "total": 0,
- "successful": 0,
- "errors": 0,
- "messages": [
- {
- "name": "error-message"
}
]
}
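A sketch of a possible round trip: export the tasks of one instance via '/tasks' and import them into another; the /tasks/import route and both base URLs are assumptions.

import json
import requests

SRC = "http://source-host:8081"   # hypothetical source instance
DST = "http://target-host:8081"   # hypothetical target instance

exported = requests.get(f"{SRC}/tasks", params={"full": "true"}).json()
result = requests.post(f"{DST}/tasks/import",   # hypothetical route
                       data={"tasks": json.dumps(exported)}).json()
print(f'{result["successful"]}/{result["total"]} imported, {result["errors"]} errors')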
Register a new task or update an existing one. Both POST and PUT methods are possible.
name required | string Name for the new or existing task. |
dfs required | string A valid dfs script. |
tags | string A list of tags for the new task, provided as a JSON list (["tag1", "tag_2", "another_tag"]). |
{- "success": true,
- "name": "testflow",
- "id": 17
}
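A minimal registration sketch; the /task/register route is an assumption and the DFS string is only a placeholder, not a validated script.

import json
import requests

BASE = "http://localhost:8081"     # hypothetical faxe API address
dfs = "|value_emitter() |debug()"  # placeholder DFS, replace with a real script

resp = requests.post(f"{BASE}/task/register",   # hypothetical route
                     data={"name": "testflow",
                           "dfs": dfs,
                           "tags": json.dumps(["tag1", "tag_2", "another_tag"])})
print(resp.json())   # e.g. {"success": true, "name": "testflow", "id": 17}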
Get a task by its id
task_id required | string Id of the task to read |
{- "id": 0,
- "name": "string",
- "dfs": "string",
- "tags": [
- "tag1",
- "tag2",
- "tag3"
], - "vars": {
- "template_vars": {
- "var1": "val1",
- "var2": "val2"
}
}, - "template": "string",
- "running": true,
- "permanent": true,
- "changed": "string",
- "last_start": "string",
- "last_stop": "string",
- "group_leader": true,
- "group": "string"
}
task_id required | string Id of the task that should be started |
quiet | boolean If set to true, will not complain about an already started task. |
persistence | boolean Whether to start the flow with active state persistence. This defaults to the state persistence config setting. |
{- "success": true
}
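For illustration, starting a task with the optional flags; the route, the HTTP method and the base URL are assumptions.

import requests

BASE = "http://localhost:8081"   # hypothetical faxe API address
task_id = 17

# Hypothetical route and method; quiet suppresses the "already started" complaint.
resp = requests.get(f"{BASE}/task/start/{task_id}",
                    params={"quiet": "true", "persistence": "false"})
print(resp.json())   # {"success": true}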
A permanently started task will automatically be started on faxe startup.
task_id required | string Id of the task that should be started permanently |
quiet | boolean If set to true, will not complain about an already started task. |
persistence | boolean Whether to start the flow with active state persistence. This defaults to the state persistence config setting. |
{- "success": true
}
task_id required | string Id of the task that should be stopped |
keepstate | boolean Whether to keep flow state when stopping the flow. By default, the flow state will be deleted on stop. |
{- "success": true
}
A permanently stopped task will NOT automatically be started on faxe startup anymore.
task_id required | string Id of the task that should be stopped permanently |
keepstate | boolean Whether to keep flow state when stopping the flow. By default, the flow state will be deleted on stop. |
{- "success": true
}
Starts 'concurrency' copies of the task with id 'task_id'.
task_id required | string Id of the task that should be started concurrently |
concurrency required | integer Number of instances that should be started concurrently |
{- "success": true
}
Change the number of copies a task-group should run. This method is to be called on a running task or task-group.
groupname required | string Name of the task-group. |
group_size required | integer New number of copies to run for this task-group |
Provide a DFS script to validate its syntax and contents. Since v0.20.0, the response contains a list of the custom nodes written in Python (and their dependencies) that are used in the provided DFS.
dfs required | string A valid dfs script. |
{- "success": true,
- "message": "script for flow is valid",
- "python_nodes": [
- "custom_node_a",
- "custom_node_b",
- "module_used_in_node_b"
]
}
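A sketch of validating a script and inspecting the reported Python nodes; the /dfs/validate route is an assumption.

import requests

BASE = "http://localhost:8081"   # hypothetical faxe API address

with open("myflow.dfs") as f:    # placeholder script file
    dfs = f.read()

result = requests.post(f"{BASE}/dfs/validate",   # hypothetical route
                       data={"dfs": dfs}).json()
if result.get("success"):
    print("valid; python nodes used:", result.get("python_nodes", []))
else:
    print("invalid:", result.get("message"))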
Provide a DFS script and a name for the new task.
name required | string Name for the newly registered task. |
dfs required | string A valid dfs script. |
tags | string A list of tags for the new task, provided as a JSON list (["tag1", "tag_2", "another_tag"]). |
{- "success": true,
- "name": "testflow",
- "id": 17
}
Provide a DFS script and a name; the update can be done while a task is running.
task_id required | string Id of the task that should be updated. |
name | string Task name |
dfs required | string A valid dfs script. |
tags | string A list of tags for the task, provided as a JSON list (["tag1", "tag_2", "another_tag"]). |
{- "success": true,
- "name": "testflow",
- "id": 17
}
Registers a new task with name 'name' from the template with id 'template_id'.
template_id required | string |
task_name required | string |
tags | string A list of tags for the new task, provided as a JSON list (["tag1", "tag_2", "another_tag"]). |
vars | string A JSON object with vars for the new task, for example: {"var1": 33}. Every definition (keyword 'def') in the template DFS can be overwritten. |
{- "success": true,
- "name": "testflow",
- "id": 17
}
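A sketch of creating a task from a template while overriding one of its 'def' values via vars; the /task/from_template/:template_id route is an assumption.

import json
import requests

BASE = "http://localhost:8081"   # hypothetical faxe API address
template_id = 4                  # example template id

resp = requests.post(f"{BASE}/task/from_template/{template_id}",   # hypothetical route
                     data={"task_name": "testflow",
                           "vars": json.dumps({"var1": 33}),   # overrides 'def var1' in the template DFS
                           "tags": json.dumps(["tag1"])})
print(resp.json())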
The graph representation contains all nodes and edges, as well as some information about external connections a node may establish.
task_id required | string |
{- "nodes": [
- {
- "type": "esp_debug",
- "name": "debug6",
- "group": 1,
- "display_name": "debug6"
}, - {
- "type": "c_python",
- "name": "@callback5",
- "group": 1,
- "display_name": "python_passthrough"
}, - {
- "type": "esp_batch",
- "name": "batch4",
- "group": 1,
- "display_name": "batch4"
}, - {
- "type": "c_python",
- "name": "@double3",
- "group": 1,
- "display_name": "python_double"
}, - {
- "type": "esp_default",
- "name": "default2",
- "group": 1,
- "display_name": "default2"
}, - {
- "type": "esp_value_emitter",
- "name": "value_emitter1",
- "group": 1,
- "display_name": "emitter"
}
], - "edges": [
- {
- "src_port": 1,
- "src": "@double3",
- "dest_port": 1,
- "dest": "batch4"
}, - {
- "src_port": 1,
- "src": "@callback5",
- "dest_port": 1,
- "dest": "debug6"
}, - {
- "src_port": 1,
- "src": "default2",
- "dest_port": 1,
- "dest": "@double3"
}, - {
- "src_port": 1,
- "src": "batch4",
- "dest_port": 1,
- "dest": "@callback5"
}, - {
- "src_port": 1,
- "src": "value_emitter1",
- "dest_port": 1,
- "dest": "default2"
}
]
}
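As a small consumption sketch, the edges can be printed as data paths between display names; the /task/graph/:task_id route and the base URL are assumptions.

import requests

BASE = "http://localhost:8081"                          # hypothetical faxe API address
graph = requests.get(f"{BASE}/task/graph/17").json()    # hypothetical route

names = {n["name"]: n["display_name"] for n in graph["nodes"]}
for edge in graph["edges"]:
    # An edge connects an output port of 'src' to an input port of 'dest'.
    print(f'{names[edge["src"]]}:{edge["src_port"]} -> {names[edge["dest"]]}:{edge["dest_port"]}')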
Returns a list of all templates currently registered with this faxe instance.
orderby | string Default: "changed" order the list of templates by 'id', 'name' or (default) 'changed' |
dir | string Default: "desc" direction for ordering, 'asc' or (default) 'desc' |
Import a list of JSON-encoded template definitions. Such a list can be retrieved with the '/templates' method, for example.
templates required | string A JSON-encoded list of templates. |
{- "total": 0,
- "successful": 0,
- "errors": 0,
- "messages": [
- {
- "name": "error-message"
}
]
}
Upload one or more Python source code files, which will be placed in the Python script folder defined in faxe's config.
{- "files": [
- {
- "uploaded": "filename.py",
- "stored": "/path/to/pythonfolder/filename.py"
}, - {
- "error": "filename.py",
- "message": "what went wrong"
}
]
}
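A hedged multipart-upload sketch; the /python/upload route and the form field name are assumptions.

import requests

BASE = "http://localhost:8081"   # hypothetical faxe API address

# Multipart upload of two local scripts; the field name "files" is an assumption.
with open("custom_node_a.py", "rb") as f1, open("custom_node_b.py", "rb") as f2:
    resp = requests.post(f"{BASE}/python/upload",
                         files=[("files", ("custom_node_a.py", f1)),
                                ("files", ("custom_node_b.py", f2))])
print(resp.json()["files"])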
Get info about the Python runtime and a list of known Python modules.
{- "vsn": "Python 3.9.1",
- "path": "/data/python",
- "modules": [
- "script1.py",
- "script2.py",
- "script3.py"
], - "executable": "/usr/local/bin/python"
}
Start the given DFS script with a TTL. The runtime can be extended with a call to /task/ping/:task_id; when the timeout is over, the task will stop and delete itself. Temporary tasks cannot be restarted or updated!
dfs required | string DFS Script to run |
ttl required | integer Timeout in milliseconds |
{- "id": "string",
- "ttl": 0,
- "success": true
}
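A sketch of the keep-alive pattern: start a temporary task and ping it before the TTL expires; the start route is an assumption, /task/ping/:task_id is taken from the description above, and the DFS string is a placeholder.

import time
import requests

BASE = "http://localhost:8081"     # hypothetical faxe API address
dfs = "|value_emitter() |debug()"  # placeholder DFS, replace with a real script

# Start a temporary task that lives for 60 seconds unless pinged.
start = requests.post(f"{BASE}/task/start_temp",   # hypothetical route
                      data={"dfs": dfs, "ttl": 60000}).json()
task_id = start["id"]

# Extend the runtime twice, then let the TTL expire so the task deletes itself.
for _ in range(2):
    time.sleep(30)
    requests.get(f"{BASE}/task/ping/{task_id}")    # route taken from the description above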