Autocode Setup | Node | Web | Python (alpha) | Ruby (alpha)
Autocode is the fastest and easiest way to build web services and APIs that respond to external SaaS events. The Autocode ecosystem treats external SaaS APIs as single-line function calls via the lib package on NPM. The Autocode CLI lets you interact seamlessly with every part of the Autocode platform:
Autocode is based on Function as a Service ("serverless") architecture, initially popularized by AWS Lambda. You can use Autocode to build modular, scalable APIs for yourself and other developers in minutes without having to manage servers, gateways, domains, write documentation, or build SDKs. Your development workflow has never been easier - focus on writing code you love, let Autocode handle everything else.
Autocode uses an open specification called FunctionScript for function definitions and execution. If you run into concerns or questions as you're building from this guide, please reference the FunctionScript repository. :)
You can view services published by our large and growing developer community on the Autocode standard library page.
To get started with Autocode, first make sure you have Node 8.x or later installed, available from the official Node.js website. Next, install the Autocode CLI tools with:
$ npm install lib.cli -g
And you're now ready to start building!
The first thing you'll want to do is create a workspace. Create a new directory you intend to build your services in and initialize the workspace.
$ mkdir autocode-workspace
$ cd autocode-workspace
$ lib init
You'll be asked for an e-mail address to log in to the Autocode registry. If you don't yet have an account, you can create one by going to https://autocode.com/. Note that you can skip account creation with lib init --no-login. You'll be unable to use the registry, but this is useful for creating workspaces when you don't have internet access.
Next, create your service:
$ lib create <service>
You'll be asked for a default function name, which is the entry point into your service (useful if you only want a single entry point). This will automatically generate a service project scaffold in autocode-workspace/<username>/<service>.
Once created, enter the service directory:
$ cd your_username/your_service
In this directory, you'll see something like:
- functions/
  - __main__.js
- package.json
- env.json
- WELCOME.md
- README.md
At this point, a "hello world" function has been automatically created (__main__.js). Autocode comes paired with a simple lib command for testing your functions locally and running them in the cloud.
To test your function:
$ lib . "hello world"
If we examine the functions/__main__.js file, we see the following:
/**
* A basic Hello World function
* @param {string} name Who you're saying hello to
* @returns {string}
*/
module.exports = async (name = 'world', context) => {
  return `hello ${name}`;
};
We can pass parameters to it using the CLI by specifying named parameters:
$ lib . --name "dolores abernathy"
"hello dolores abernathy"
Note that context is a magic parameter (automatically populated with execution details, when provided), as is callback (which terminates execution), so these don't need to be documented and cannot be specified as parameters when executing the function.
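Because a FunctionScript endpoint is just an async JavaScript function, you can also exercise it directly (for example, in a unit test) by supplying your own mock context. This is an illustrative sketch; the mock context below only stubs the one field used elsewhere in this guide, not the full object the gateway provides:

```javascript
// Same body as functions/__main__.js, bound to a local name for direct calls
const helloWorld = async (name = 'world', context) => {
  return `hello ${name}`;
};

// Mock only the context field we care about (illustrative, not the real gateway object)
const mockContext = {service: {identifier: 'your_username.your_service[@local]'}};

(async () => {
  console.log(await helloWorld('dolores abernathy', mockContext)); // "hello dolores abernathy"
  console.log(await helloWorld(undefined, mockContext));           // "hello world" (default applies)
})();
```

This is handy for quick local experiments without going through the CLI at all.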
To push your function to a development environment in the cloud:
$ lib up dev
$ lib your_username.your_service[@dev]
"hello world"
And to release it (when you're ready!):
$ lib release
$ lib your_username.your_service
"hello world"
You can check out your service on the web, and use it in applications via our functions gateway, api.stdlib.com:
https://your_username.api.stdlib.com/your_service/
That's it! You've barely written a line of code, and you've already mastered building a service, testing it in a development (staging) environment online, and releasing it for private (or public) consumption.
Note: By default, APIs that you publish with lib release will have a documentation page in the Autocode public registry. You can keep your page private, as well as restrict execution access or add collaborators to your API, by modifying your API's permissions. For more information, see this docs page.
Another Note: Staging environments (like the one created with lib up dev) are mutable and can be replaced indefinitely. Releases (lib release) are immutable and can never be overwritten. However, any service can be torn down with lib down <environment> or lib down -r <version> (but releases can't be replaced once removed, to prevent mistakes and/or bad actors).
You'll notice that you can create more than one function per service. While you can structure your project however you'd like internally, it should also be noted that these functions have zero-latency access to each other. You can access them internally with the lib package on NPM, which behaves similarly to the lib command for testing. Use:
$ npm install lib --save
in your main service directory to add it, and use it like so:
module.exports = async (a = 0, b = 0) => {
  return a + b;
};
const lib = require('lib');

module.exports = async (a = 0, b = 0, context) => {
  let result = await lib[`${context.service.identifier}.add`]({a: a, b: b});
  return result * 2;
};
In this case, calling lib .add --a 1 --b 2 will return 3 and lib .add_double --a 1 --b 2 will return 6. The context magic parameter is used for its context.service.identifier property, which will return the string "your_username.your_service[@local]" in the case of local execution, or "your_username.your_service[@ENV]" when deployed to an environment or release (where ENV is your environment name or semver).
As mentioned in the previous section, you can use the NPM lib package that's available on GitHub and NPM to access your APIs from legacy Node.js applications and even the web browser. We'll have more SDKs coming out in the following months.
An existing app would call a function (username.bestTrekChar at version 0.2.1) like this:
const lib = require('lib');

let result;
try {
  result = await lib.username.bestTrekChar['@0.2.1']({name: 'spock'});
} catch (err) {
  // handle error
}
// do something with result
Which would speak to your API...
module.exports = async (name = 'kirk') => {
  if (name === 'kirk') {
    return 'why, thank, you, too, kind';
  } else if (name === 'spock') {
    return 'i think this feeling is called "pleased"';
  } else {
    throw new Error('Only kirk and spock supported.');
  }
};
We definitely recommend using the lib library on NPM to make API calls as specified above, but you can also make HTTPS requests directly to the Autocode gateway. HTTP query parameters are mapped automatically to parameters by name.
https://username.api.stdlib.com/liveService@1.12.2/?name=BATMAN
Maps directly to:
/**
* Hello World
* @param {string} name
* @returns {string}
*/
module.exports = async (name = 'world') => {
  // returns "Hello BATMAN" from the above HTTP query
  return `Hello ${name}`;
};
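To make that query-string-to-parameter mapping concrete, here is a small local emulation of the behavior. This is an illustrative sketch, not Autocode's actual gateway code; paramFromUrl is a hypothetical helper built on Node's standard URL class:

```javascript
// The endpoint function, as above
const helloWorld = async (name = 'world') => `Hello ${name}`;

// Hypothetical helper: pull a named parameter out of a request URL.
// Missing parameters become undefined, so the function's default kicks in.
const paramFromUrl = (url, key) => {
  const value = new URL(url).searchParams.get(key);
  return value === null ? undefined : value;
};

(async () => {
  const url = 'https://username.api.stdlib.com/liveService@1.12.2/?name=BATMAN';
  console.log(await helloWorld(paramFromUrl(url, 'name'))); // "Hello BATMAN"

  const bare = 'https://username.api.stdlib.com/liveService@1.12.2/';
  console.log(await helloWorld(paramFromUrl(bare, 'name'))); // "Hello world"
})();
```

The real gateway also handles type coercion per the FunctionScript specification; this sketch only shows the by-name matching.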
A quick note on version control: Autocode is not a replacement for normal git-based workflows; it is a supplement focused on service creation and execution.
You have unlimited access to any release (that hasn't been torn down) with lib download <serviceIdentifier>, which downloads and unpacks the tarball to a working directory.
Tarballs (and package contents) are closed-source. Nobody but you (and potentially your teammates) has access to these. It's up to you whether or not you share the guts of your service with others on GitHub or NPM.
As mentioned above, releases are immutable and cannot be overwritten (they can be removed, just not replaced afterwards), while development / staging environments are mutable and can be overwritten as much as you'd like.
Logging for services is enabled by default. When running a service locally with lib . or lib .functionname, all logs will be output in your console. The very last output (normally a JSON-compatible string) is the return value of the function.
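To illustrate the distinction between logs and the return value, here is a hypothetical endpoint (checkout is an invented example, not part of the generated scaffold):

```javascript
// Hypothetical endpoint: everything written to stdout/stderr during execution
// is captured as logs; only the returned value reaches the caller.
const checkout = async (items = []) => {
  console.log(`processing ${items.length} item(s)`);   // shows up in stdout logs
  const total = items.reduce((sum, price) => sum + price, 0);
  console.error('debug: total computed');              // shows up in stderr logs
  return {total: total};                               // JSON-compatible return value
};

(async () => {
  const result = await checkout([5, 10, 15]);
  console.log(JSON.stringify(result)); // {"total":30}
})();
```

When run locally with lib ., the console.log and console.error lines appear first, and the serialized return value is the final line of output.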
To view remote logs (in dev or release environments), use the following syntax:
:: Lists all logs for the service
$ lib logs username.servicename

:: Lists main service endpoint logs for "dev" environment
$ lib logs username.servicename[@dev]

:: Lists service endpoint named "test" logs for "dev" environment
$ lib logs username.servicename[@dev].test

:: Lists all logs for "dev" environment
$ lib logs username.servicename[@dev]*
$ lib logs username.servicename[@dev].*
The default log type is stdout, though you can specify stderr with lib logs username.servicename -t stderr. Limit the number of lines shown with the -l argument (or --lines).
Autocode comes packed with a bunch of other goodies. As we roll out updates to the platform, the serverless builds we're using may change. You can update your service to our latest build using lib rebuild. If for any reason your service goes down and is unrecoverable, you can fix it with this command.
To see a full list of commands available for the CLI tools, type:
$ lib help
We've conveniently copy-and-pasted the output here for you to peruse:
*
-b Execute as a Background Function
-d Specify debug mode (prints Gateway logs locally, response logs remotely)
-i Specify information mode (prints tar packing and execution request progress)
-t Specify an Identity Token to use manually
-x Unauthenticated - Execute without a token (overrides active token and -t flag)
--* all verbose flags converted to named keyword parameters
Runs an Autocode function, i.e. "lib user.service[@env]" (remote) or "lib ." (local)
create [service]
-n No login - don't require an internet connection
-w Write over - overwrite the current directory contents
--no-login No login - don't require an internet connection
--write-over Write over - overwrite the current directory contents
Creates a new (local) service
down [environment]
-r Remove a release version (provide number)
--release Remove a release version (provide number)
Removes Autocode package from registry and cloud environment
download [username/name OR username/name@env OR username/name@version]
-w Write over - overwrite the target directory contents
--write-over Write over - overwrite the target directory contents
Retrieves and extracts Autocode package
endpoints:create [name] [description] [param_1] [param_2] [...] [param_n]
-n New directory: Create as a __main__.js file, with the name representing the directory
--new New directory: Create as a __main__.js file, with the name representing the directory
Creates a new endpoint for a service
hostnames:add [source] [target]
Adds a new hostname route from a source custom hostname to a target service you own.
Accepts wildcards wrapped in curly braces ("{}") or "*" at the front of the hostname.
hostnames:list
Displays created hostname routes from source custom hostnames to target services you own
hostnames:remove
Removes a hostname route from a source custom hostname to a target service you own
http
-p Port (default 8170)
--port Port (default 8170)
Creates HTTP Server for Current Service
init [environment]
-f Force command to overwrite existing workspace
-n No login - don't require an internet connection
--force Force command to overwrite existing workspace
--no-login No login - don't require an internet connection
Initializes Autocode workspace
login
--email E-mail
--password Password
Logs in to Autocode
logout
-f Force - clears information even if current Access Token invalid
--force Force - clears information even if current Access Token invalid
Logs out of Autocode in this workspace
logs [service]
-l The number of log lines you want to retrieve
-t The log type you want to retrieve. Allowed values are "stdout" and "stderr".
--lines The number of log lines you want to retrieve
--type The log type you want to retrieve. Allowed values are "stdout" and "stderr".
Retrieves logs for a given service
rebuild [environment]
-r Rebuild a release package
--release Rebuild a release package
Rebuilds a service (useful for registry performance updates), alias of `lib restart -b`
release
Pushes release of Autocode package to registry and cloud (Alias of `lib up -r`)
tokens
Selects an active Identity Token for API Authentication
tokens:add-to-env
Sets STDLIB_SECRET_TOKEN in env.json "local" field to the value of an existing token
tokens:list
-a All - show invalidated tokens as well
-s Silent mode - do not display information
--all All - show invalidated tokens as well
--silent Silent mode - do not display information
Lists your remotely generated Identity Tokens (Authentication)
up [environment]
-f Force deploy
-r Upload a release package
--force Force deploy
--release Upload a release package
Pushes Autocode package to registry and cloud environment
user
-s <key> <value> Sets a specified key-value pair
--new-password Sets a new password via a prompt
--reset-password <email> Sends a password reset request for the specified e-mail address
--set <key> <value> Sets a specified key-value pair
Retrieves (and sets) current user information
version
Returns currently installed version of Autocode command line tools
If you're running a previous version and are having issues with the CLI, try cleaning up the old CLI binary links first:
$ rm /usr/local/bin/f
$ rm /usr/local/bin/lib
$ rm /usr/local/bin/stdlib
Yep, it's really that easy. To keep up-to-date on developments, please star us here on GitHub, and sign up for a user account on the registry. You can read more about service hosting and keep track of official updates on the official Autocode website, autocode.com.
Autocode is a product of and © 2021 Polybit Inc.
We'd love for you to pay attention to @AutocodeHQ and what we're building next! If you'd consider joining the team, shoot us an e-mail.
You can also follow our team on Twitter. Issues are encouraged!