[FL-3162] Moved ufbt to fbt codebase (#2520)

* scripts: moved ufbt code
* ufbt: fixed tool path
* ufbt: fixed linter/formatter target descriptions
* scripts: ufbt: cleanup
* fbt: moved fap launch target to tools; ufbt fixes
* fbt: fixed missing headers from SDK
* ufbt: removed debug output
* ufbt: moved project template to main codebase
* ufbt: fixed vscode_dist
* ufbt: path naming changes
* fbt: error message for older ufbt versions
* ufbt: docs fixes
* ufbt: fixed build dir location
* fbt: fixes for extapps objcopy
* fbt: extapps: removed extra debug output; fixed formatting
* ufbt: handle launch target for multiple known apps
* ufbt: dropping wrapper; linter fixes
* ufbt: fixed bootstrap path
* ufbt: renamed entrypoint
* ufbt: updated vscode config
* ufbt: moved sconsign db location
* ufbt: fixed sconsign path
* fbt: SDK builders rework
* fbt: reworked sdk packaging
* ufbt: additional checks and state processing
* ufbt: fixed sdk state file location
* dist: not packaging pycache
* dump commit json content
* Github: more workflow debug prints
* Github: fix incorrect commit meta extraction in get_env.py
* ufbt, fbt: changed SConsEnvironmentError->StopError
* fbtenv: no longer needs SCRIPT_PATH pre-set
* ufbt: fixed sdk state check
* scripts: exception fixes for storage.py
* scripts: fbtenv: added FBT_TOOLCHAIN_PATH on Windows for compatibility
* ufbt: app template: creating .gitkeep for images folder
* ufbt: app template: fixed .gitkeep creation
* docs: formatting fixes for AppManifests; added link to ufbt
* fbt: added link to PyPI for old ufbt versions
* sdk: fixed dir component paths

Co-authored-by: Aleksandr Kutuzov <alleteam@gmail.com>
hedger 2023-04-06 06:44:37 +04:00 committed by GitHub
parent 8a021ae48c
commit a91d319839
32 changed files with 1496 additions and 200 deletions

View File

@ -4,7 +4,7 @@
#include <flipper_application/api_hashtable/compilesort.hpp>
/* Generated table */
#include <symbols.h>
#include <firmware_api_table.h>
static_assert(!has_hash_collisions(elf_api_table), "Detected API method hash collision!");

View File

@ -75,12 +75,12 @@ Example for building an app from Rust sources:
Library sources must be placed in a subfolder of the `lib` folder within the application's source folder.
Each library is defined as a call to the `Lib()` function, accepting the following parameters:
- **name**: name of the library's folder. Required.
- **fap_include_paths**: list of the library's relative paths to add to the parent fap's include path list. The default value is `["."]`, meaning the library's source root.
- **sources**: list of filename masks to be used for gathering source files for this library. Paths are relative to the library's source root. The default value is `["*.c*"]`.
- **cflags**: list of additional compiler flags to be used for building this library. The default value is `[]`.
- **cdefines**: list of additional preprocessor definitions to be used for building this library. The default value is `[]`.
- **cincludes**: list of additional include paths to be used for building this library. Paths are relative to the application's root. This can be used for providing external search paths for this library's code — for configuration headers. The default value is `[]`.
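For illustration, a minimal sketch of an `application.fam` using these `Lib()` parameters might look like the following; the library name, define and extra include folder are made up, and manifests are evaluated by fbt, which provides `App`, `Lib` and `FlipperAppType`:

```python
App(
    appid="example_app",
    apptype=FlipperAppType.EXTERNAL,
    entry_point="example_app_main",
    sources=["*.c"],
    fap_private_libs=[
        Lib(
            name="mylib",                    # folder: <app>/lib/mylib
            fap_include_paths=["include"],   # added to the parent fap's include paths
            sources=["*.c"],                 # masks, relative to lib/mylib
            cdefines=["MYLIB_FEATURE=1"],
            cincludes=["lib/mylib_config"],  # extra include path, relative to the app root
        ),
    ],
)
```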
Example for building an app with a private library:

View File

@ -3,6 +3,8 @@
FBT is the entry point for firmware-related commands and utilities.
It is invoked by `./fbt` in the firmware project root directory. Internally, it is a wrapper around the [SCons](https://scons.org/) build system.
If you don't need all the features of `fbt` - like building the whole firmware - and only want to build and debug a single application, you can use [ufbt](https://pypi.org/project/ufbt/).
## Environment
To use `fbt`, you only need `git` installed in your system.

View File

@ -68,7 +68,7 @@ env = ENV.Clone(
],
},
},
SDK_APISYMS=None,
FW_API_TABLE=None,
_APP_ICONS=None,
)
@ -241,7 +241,7 @@ Depends(
[
fwenv["FW_VERSION_JSON"],
fwenv["FW_ASSETS_HEADERS"],
fwenv["SDK_APISYMS"],
fwenv["FW_API_TABLE"],
fwenv["_APP_ICONS"],
],
)

View File

@ -1,6 +1,6 @@
from SCons.Builder import Builder
from SCons.Action import Action
from SCons.Errors import SConsEnvironmentError
from SCons.Errors import StopError
import os
import subprocess
@ -90,7 +90,7 @@ def proto_ver_generator(target, source, env):
source_dir=src_dir,
)
except (subprocess.CalledProcessError, EnvironmentError) as e:
raise SConsEnvironmentError("Git: describe failed")
raise StopError("Git: describe failed")
git_major, git_minor = git_describe.split(".")
version_file_data = (

View File

@ -21,6 +21,10 @@ from fbt.sdk.cache import SdkCache
from fbt.util import extract_abs_dir_path
_FAP_META_SECTION = ".fapmeta"
_FAP_FILEASSETS_SECTION = ".fapassets"
@dataclass
class FlipperExternalAppInfo:
app: FlipperApplication
@ -234,6 +238,8 @@ def BuildAppElf(env, app):
def prepare_app_metadata(target, source, env):
metadata_node = next(filter(lambda t: t.name.endswith(_FAP_META_SECTION), target))
sdk_cache = SdkCache(env["SDK_DEFINITION"].path, load_version_only=True)
if not sdk_cache.is_buildable():
@ -242,8 +248,7 @@ def prepare_app_metadata(target, source, env):
)
app = env["APP"]
meta_file_name = source[0].path + ".meta"
with open(meta_file_name, "wb") as f:
with open(metadata_node.abspath, "wb") as f:
f.write(
assemble_manifest_data(
app_manifest=app,
@ -337,24 +342,26 @@ def embed_app_metadata_emitter(target, source, env):
if app.apptype == FlipperAppType.PLUGIN:
target[0].name = target[0].name.replace(".fap", ".fal")
meta_file_name = source[0].path + ".meta"
target.append("#" + meta_file_name)
target.append(env.File(source[0].abspath + _FAP_META_SECTION))
if app.fap_file_assets:
files_section = source[0].path + ".files.section"
target.append("#" + files_section)
target.append(env.File(source[0].abspath + _FAP_FILEASSETS_SECTION))
return (target, source)
def prepare_app_files(target, source, env):
files_section_node = next(
filter(lambda t: t.name.endswith(_FAP_FILEASSETS_SECTION), target)
)
app = env["APP"]
directory = app._appdir.Dir(app.fap_file_assets)
directory = env.Dir(app._apppath).Dir(app.fap_file_assets)
if not directory.exists():
raise UserError(f"File asset directory {directory} does not exist")
bundler = FileBundler(directory.abspath)
bundler.export(source[0].path + ".files.section")
bundler.export(files_section_node.abspath)
def generate_embed_app_metadata_actions(source, target, env, for_signature):
@ -367,15 +374,15 @@ def generate_embed_app_metadata_actions(source, target, env, for_signature):
objcopy_str = (
"${OBJCOPY} "
"--remove-section .ARM.attributes "
"--add-section .fapmeta=${SOURCE}.meta "
"--add-section ${_FAP_META_SECTION}=${SOURCE}${_FAP_META_SECTION} "
)
if app.fap_file_assets:
actions.append(Action(prepare_app_files, "$APPFILE_COMSTR"))
objcopy_str += "--add-section .fapassets=${SOURCE}.files.section "
objcopy_str += "--add-section ${_FAP_FILEASSETS_SECTION}=${SOURCE}${_FAP_FILEASSETS_SECTION} "
objcopy_str += (
"--set-section-flags .fapmeta=contents,noload,readonly,data "
"--set-section-flags ${_FAP_META_SECTION}=contents,noload,readonly,data "
"--strip-debug --strip-unneeded "
"--add-gnu-debuglink=${SOURCE} "
"${SOURCES} ${TARGET}"
@ -391,6 +398,51 @@ def generate_embed_app_metadata_actions(source, target, env, for_signature):
return Action(actions)
def AddAppLaunchTarget(env, appname, launch_target_name):
deploy_sources, flipp_dist_paths, validators = [], [], []
run_script_extra_ars = ""
def _add_dist_targets(app_artifacts):
validators.append(app_artifacts.validator)
for _, ext_path in app_artifacts.dist_entries:
deploy_sources.append(app_artifacts.compact)
flipp_dist_paths.append(f"/ext/{ext_path}")
return app_artifacts
def _add_host_app_to_targets(host_app):
artifacts_app_to_run = env["EXT_APPS"].get(host_app.appid, None)
_add_dist_targets(artifacts_app_to_run)
for plugin in host_app._plugins:
_add_dist_targets(env["EXT_APPS"].get(plugin.appid, None))
artifacts_app_to_run = env.GetExtAppByIdOrPath(appname)
if artifacts_app_to_run.app.apptype == FlipperAppType.PLUGIN:
# We deploy host app instead
host_app = env["APPMGR"].get(artifacts_app_to_run.app.requires[0])
if host_app:
if host_app.apptype == FlipperAppType.EXTERNAL:
_add_host_app_to_targets(host_app)
else:
# host app is a built-in app
run_script_extra_ars = f"-a {host_app.name}"
_add_dist_targets(artifacts_app_to_run)
else:
raise UserError("Host app is unknown")
else:
_add_host_app_to_targets(artifacts_app_to_run.app)
# print(deploy_sources, flipp_dist_paths)
env.PhonyTarget(
launch_target_name,
'${PYTHON3} "${APP_RUN_SCRIPT}" ${EXTRA_ARGS} -s ${SOURCES} -t ${FLIPPER_FILE_TARGETS}',
source=deploy_sources,
FLIPPER_FILE_TARGETS=flipp_dist_paths,
EXTRA_ARGS=run_script_extra_ars,
)
env.Alias(launch_target_name, validators)
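As a usage sketch for the new `AddAppLaunchTarget` method (the appid and alias name below are placeholders; the firmware build and the ufbt `SConstruct` later in this diff wire it the same way):

```python
# Usage sketch (illustrative): a build script with this tool loaded registers a
# launch alias for one external app. "example_app" is a placeholder appid.
appenv.AddAppLaunchTarget("example_app", "launch_app")
# Running the alias (e.g. `./fbt launch_app`) deploys the fap - plus the host app
# and that host's plugins when the appid refers to a PLUGIN - and starts it on the
# Flipper via ${APP_RUN_SCRIPT}.
```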
def generate(env, **kw):
env.SetDefault(
EXT_APPS_WORK_DIR="${FBT_FAP_DEBUG_ELF_ROOT}",
@ -410,10 +462,14 @@ def generate(env, **kw):
EXT_APPS={}, # appid -> FlipperExternalAppInfo
EXT_LIBS={},
_APP_ICONS=[],
_FAP_META_SECTION=_FAP_META_SECTION,
_FAP_FILEASSETS_SECTION=_FAP_FILEASSETS_SECTION,
)
env.AddMethod(BuildAppElf)
env.AddMethod(GetExtAppByIdOrPath)
env.AddMethod(AddAppLaunchTarget)
env.Append(
BUILDERS={
"FapDist": Builder(

View File

@ -38,13 +38,13 @@ def ProcessSdkDepends(env, filename):
return depends
def prebuild_sdk_emitter(target, source, env):
def api_amalgam_emitter(target, source, env):
target.append(env.ChangeFileExtension(target[0], ".d"))
target.append(env.ChangeFileExtension(target[0], ".i.c"))
return target, source
def prebuild_sdk_create_origin_file(target, source, env):
def api_amalgam_gen_origin_header(target, source, env):
mega_file = env.subst("${TARGET}.c", target=target[0])
with open(mega_file, "wt") as sdk_c:
sdk_c.write(
@ -87,6 +87,7 @@ class SdkMeta:
class SdkTreeBuilder:
SDK_DIR_SUBST = "SDK_ROOT_DIR"
SDK_APP_EP_SUBST = "SDK_APP_EP_SUBST"
HEADER_EXTENSIONS = [".h", ".hpp"]
def __init__(self, env, target, source) -> None:
self.env = env
@ -111,7 +112,10 @@ class SdkTreeBuilder:
lines = LogicalLines(deps_f).readlines()
_, depends = lines[0].split(":", 1)
self.header_depends = list(
filter(lambda fname: fname.endswith(".h"), depends.split()),
filter(
lambda fname: any(map(fname.endswith, self.HEADER_EXTENSIONS)),
depends.split(),
),
)
self.header_depends.append(self.sdk_env.subst("${LINKER_SCRIPT_PATH}"))
self.header_depends.append(self.sdk_env.subst("${SDK_DEFINITION}"))
@ -180,12 +184,12 @@ class SdkTreeBuilder:
self._generate_sdk_meta()
def deploy_sdk_tree_action(target, source, env):
def deploy_sdk_header_tree_action(target, source, env):
sdk_tree = SdkTreeBuilder(env, target, source)
return sdk_tree.deploy_action()
def deploy_sdk_tree_emitter(target, source, env):
def deploy_sdk_header_tree_emitter(target, source, env):
sdk_tree = SdkTreeBuilder(env, target, source)
return sdk_tree.emitter(target, source, env)
@ -224,7 +228,7 @@ def _check_sdk_is_up2date(sdk_cache: SdkCache):
)
def validate_sdk_cache(source, target, env):
def validate_api_cache(source, target, env):
# print(f"Generating SDK for {source[0]} to {target[0]}")
current_sdk = SdkCollector()
current_sdk.process_source_file_for_sdk(source[0].path)
@ -237,7 +241,7 @@ def validate_sdk_cache(source, target, env):
_check_sdk_is_up2date(sdk_cache)
def generate_sdk_symbols(source, target, env):
def generate_api_table(source, target, env):
sdk_cache = SdkCache(source[0].path)
_check_sdk_is_up2date(sdk_cache)
@ -249,11 +253,11 @@ def generate_sdk_symbols(source, target, env):
def generate(env, **kw):
if not env["VERBOSE"]:
env.SetDefault(
SDK_PREGEN_COMSTR="\tPREGEN\t${TARGET}",
SDK_COMSTR="\tSDKSRC\t${TARGET}",
SDK_AMALGAMATE_HEADER_COMSTR="\tAPIPREP\t${TARGET}",
SDK_AMALGAMATE_PP_COMSTR="\tAPIPP\t${TARGET}",
SDKSYM_UPDATER_COMSTR="\tSDKCHK\t${TARGET}",
SDKSYM_GENERATOR_COMSTR="\tSDKSYM\t${TARGET}",
SDKDEPLOY_COMSTR="\tSDKTREE\t${TARGET}",
APITABLE_GENERATOR_COMSTR="\tAPITBL\t${TARGET}",
SDKTREE_COMSTR="\tSDKTREE\t${TARGET}",
)
# Filtering out things cxxheaderparser cannot handle
@ -274,40 +278,40 @@ def generate(env, **kw):
env.AddMethod(ProcessSdkDepends)
env.Append(
BUILDERS={
"SDKPrebuilder": Builder(
emitter=prebuild_sdk_emitter,
"ApiAmalgamator": Builder(
emitter=api_amalgam_emitter,
action=[
Action(
prebuild_sdk_create_origin_file,
"$SDK_PREGEN_COMSTR",
api_amalgam_gen_origin_header,
"$SDK_AMALGAMATE_HEADER_COMSTR",
),
Action(
"$CC -o $TARGET -E -P $CCFLAGS $_CCCOMCOM $SDK_PP_FLAGS -MMD ${TARGET}.c",
"$SDK_COMSTR",
"$SDK_AMALGAMATE_PP_COMSTR",
),
],
suffix=".i",
),
"SDKTree": Builder(
"SDKHeaderTreeExtractor": Builder(
action=Action(
deploy_sdk_tree_action,
"$SDKDEPLOY_COMSTR",
deploy_sdk_header_tree_action,
"$SDKTREE_COMSTR",
),
emitter=deploy_sdk_tree_emitter,
emitter=deploy_sdk_header_tree_emitter,
src_suffix=".d",
),
"SDKSymUpdater": Builder(
"ApiTableValidator": Builder(
action=Action(
validate_sdk_cache,
validate_api_cache,
"$SDKSYM_UPDATER_COMSTR",
),
suffix=".csv",
src_suffix=".i",
),
"SDKSymGenerator": Builder(
"ApiSymbolTable": Builder(
action=Action(
generate_sdk_symbols,
"$SDKSYM_GENERATOR_COMSTR",
generate_api_table,
"$APITABLE_GENERATOR_COMSTR",
),
suffix=".h",
src_suffix=".csv",

View File

@ -1,4 +1,6 @@
import SCons.Warnings as Warnings
from SCons.Errors import UserError
# from SCons.Script.Main import find_deepest_user_frame
@ -36,6 +38,11 @@ def fbt_warning(e):
def generate(env):
if env.get("UFBT_WORK_DIR"):
raise UserError(
"You're trying to use a new format SDK on a legacy ufbt version. "
"Please update ufbt to a version from PyPI: https://pypi.org/project/ufbt/"
)
Warnings._warningOut = fbt_warning

View File

@ -56,11 +56,11 @@ class StorageErrorCode(enum.Enum):
class FlipperStorageException(Exception):
def __init__(self, message):
super().__init__(f"Storage error: {message}")
def __init__(self, path: str, error_code: StorageErrorCode):
super().__init__(f"Storage error: path '{path}': {error_code.value}")
@staticmethod
def from_error_code(path: str, error_code: StorageErrorCode):
return FlipperStorageException(
f"Storage error: path '{path}': {error_code.value}"
)
class BufferedRead:
@ -247,7 +247,9 @@ class FlipperStorage:
if self.has_error(answer):
last_error = self.get_error(answer)
self.read.until(self.CLI_PROMPT)
raise FlipperStorageException(filename_to, last_error)
raise FlipperStorageException.from_error_code(
filename_to, last_error
)
self.port.write(filedata)
self.read.until(self.CLI_PROMPT)
@ -319,7 +321,7 @@ class FlipperStorage:
StorageErrorCode.INVALID_NAME,
):
return False
raise FlipperStorageException(path, error_code)
raise FlipperStorageException.from_error_code(path, error_code)
return True
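A brief hedged sketch of the error-handling pattern this refactor enables; the wrapper function below is illustrative, while `has_error`, `get_error` and `from_error_code` are the methods touched in this file:

```python
# Illustrative wrapper: convert an error answer from the Flipper CLI into the new
# exception, which now carries both the offending path and the StorageErrorCode.
def raise_if_storage_error(storage: "FlipperStorage", path: str, answer: str) -> None:
    if storage.has_error(answer):
        error_code = storage.get_error(answer)  # -> StorageErrorCode member
        raise FlipperStorageException.from_error_code(path, error_code)
```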
@ -333,7 +335,7 @@ class FlipperStorage:
def _check_no_error(self, response, path=None):
if self.has_error(response):
raise FlipperStorageException(self.get_error(response))
raise FlipperStorageException.from_error_code(self.get_error(response))
def size(self, path: str):
"""file size on Flipper"""

View File

@ -31,9 +31,10 @@ def parse_args():
def get_commit_json(event):
context = ssl._create_unverified_context()
with urllib.request.urlopen(
event["pull_request"]["_links"]["commits"]["href"], context=context
) as commit_file:
commit_url = event["pull_request"]["base"]["repo"]["commits_url"].replace(
"{/sha}", f"/{event['after']}"
)
with urllib.request.urlopen(commit_url, context=context) as commit_file:
commit_json = json.loads(commit_file.read().decode("utf-8"))
return commit_json
@ -43,8 +44,8 @@ def get_details(event, args):
current_time = datetime.datetime.utcnow().date()
if args.type == "pull":
commit_json = get_commit_json(event)
data["commit_comment"] = shlex.quote(commit_json[-1]["commit"]["message"])
data["commit_hash"] = commit_json[-1]["sha"]
data["commit_comment"] = shlex.quote(commit_json["commit"]["message"])
data["commit_hash"] = commit_json["sha"]
ref = event["pull_request"]["head"]["ref"]
data["pull_id"] = event["pull_request"]["number"]
data["pull_name"] = shlex.quote(event["pull_request"]["title"])
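For clarity, a small worked example of the new commit URL construction (repository and SHA values are invented):

```python
# Invented event fragment showing how commits_url is expanded for the "after" SHA
event = {
    "after": "0123abcd4567ef89",
    "pull_request": {
        "base": {
            "repo": {"commits_url": "https://api.github.com/repos/owner/repo/commits{/sha}"}
        }
    },
}
commit_url = event["pull_request"]["base"]["repo"]["commits_url"].replace(
    "{/sha}", f"/{event['after']}"
)
# -> "https://api.github.com/repos/owner/repo/commits/0123abcd4567ef89"
# The response is a single commit object, so commit_json["sha"] and
# commit_json["commit"]["message"] are used directly instead of commit_json[-1].
```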

View File

@ -1,13 +1,15 @@
#!/usr/bin/env python3
from flipper.app import App
from os.path import join, exists, relpath
from os import makedirs, walk
from update import Main as UpdateMain
import json
import shutil
import zipfile
import tarfile
import zipfile
from os import makedirs, walk
from os.path import exists, join, relpath, basename, split
from ansi.color import fg
from flipper.app import App
from update import Main as UpdateMain
class ProjectDir:
@ -54,12 +56,19 @@ class Main(App):
if project_name == "firmware" and filetype != "elf":
project_name = "full"
return self.get_dist_file_name(project_name, filetype)
dist_target_path = self.get_dist_file_name(project_name, filetype)
self.note_dist_component(
project_name, filetype, self.get_dist_path(dist_target_path)
)
return dist_target_path
def note_dist_component(self, component: str, extension: str, srcpath: str) -> None:
self._dist_components[f"{component}.{extension}"] = srcpath
def get_dist_file_name(self, dist_artifact_type: str, filetype: str) -> str:
return f"{self.DIST_FILE_PREFIX}{self.target}-{dist_artifact_type}-{self.args.suffix}.{filetype}"
def get_dist_file_path(self, filename: str) -> str:
def get_dist_path(self, filename: str) -> str:
return join(self.output_dir_path, filename)
def copy_single_project(self, project: ProjectDir) -> None:
@ -69,17 +78,15 @@ class Main(App):
if exists(src_file := join(obj_directory, f"{project.project}.{filetype}")):
shutil.copyfile(
src_file,
self.get_dist_file_path(
self.get_project_file_name(project, filetype)
),
self.get_dist_path(self.get_project_file_name(project, filetype)),
)
for foldertype in ("sdk", "lib"):
for foldertype in ("sdk_headers", "lib"):
if exists(sdk_folder := join(obj_directory, foldertype)):
self.package_zip(foldertype, sdk_folder)
self.note_dist_component(foldertype, "dir", sdk_folder)
def package_zip(self, foldertype, sdk_folder):
with zipfile.ZipFile(
self.get_dist_file_path(self.get_dist_file_name(foldertype, "zip")),
self.get_dist_path(self.get_dist_file_name(foldertype, "zip")),
"w",
zipfile.ZIP_DEFLATED,
) as zf:
@ -94,7 +101,8 @@ class Main(App):
)
def copy(self) -> int:
self.projects = dict(
self._dist_components: dict[str, str] = dict()
self.projects: dict[str, ProjectDir] = dict(
map(
lambda pd: (pd.project, pd),
map(ProjectDir, self.args.project),
@ -122,12 +130,18 @@ class Main(App):
try:
shutil.rmtree(self.output_dir_path)
except Exception as ex:
pass
self.logger.warn(f"Failed to clean output directory: {ex}")
if not exists(self.output_dir_path):
self.logger.debug(f"Creating output directory {self.output_dir_path}")
makedirs(self.output_dir_path)
for folder in ("debug", "scripts"):
if exists(folder):
self.note_dist_component(folder, "dir", folder)
for project in self.projects.values():
self.logger.debug(f"Copying {project.project} for {project.target}")
self.copy_single_project(project)
self.logger.info(
@ -137,59 +151,133 @@ class Main(App):
)
if self.args.version:
bundle_dir_name = f"{self.target}-update-{self.args.suffix}"[
: self.DIST_FOLDER_MAX_NAME_LENGTH
]
bundle_dir = join(self.output_dir_path, bundle_dir_name)
bundle_args = [
"generate",
"-d",
bundle_dir,
"-v",
self.args.version,
"-t",
self.target,
"--dfu",
self.get_dist_file_path(
self.get_project_file_name(self.projects["firmware"], "dfu")
),
"--stage",
self.get_dist_file_path(
self.get_project_file_name(self.projects["updater"], "bin")
),
]
if self.args.resources:
bundle_args.extend(
(
"-r",
self.args.resources,
)
)
bundle_args.extend(self.other_args)
if bundle_result := self.bundle_update_package():
return bundle_result
if (bundle_result := UpdateMain(no_exit=True)(bundle_args)) == 0:
self.logger.info(
fg.boldgreen(
f"Use this directory to self-update your Flipper:\n\t{bundle_dir}"
)
)
# Create tgz archive
with tarfile.open(
join(
self.output_dir_path,
f"{self.DIST_FILE_PREFIX}{bundle_dir_name}.tgz",
),
"w:gz",
compresslevel=9,
format=tarfile.USTAR_FORMAT,
) as tar:
tar.add(bundle_dir, arcname=bundle_dir_name)
return bundle_result
required_components = ("firmware.elf", "full.bin", "update.dir")
if all(
map(
lambda c: c in self._dist_components,
required_components,
)
):
self.bundle_sdk()
return 0
def bundle_sdk(self):
self.logger.info("Bundling SDK")
components_paths = dict()
sdk_components_keys = (
"full.bin",
"firmware.elf",
"update.dir",
"sdk_headers.dir",
"lib.dir",
"debug.dir",
"scripts.dir",
)
with zipfile.ZipFile(
self.get_dist_path(self.get_dist_file_name("sdk", "zip")),
"w",
zipfile.ZIP_DEFLATED,
) as zf:
for component_key in sdk_components_keys:
component_path = self._dist_components.get(component_key)
components_paths[component_key] = basename(component_path)
if component_key.endswith(".dir"):
for root, dirnames, files in walk(component_path):
if "__pycache__" in dirnames:
dirnames.remove("__pycache__")
for file in files:
zf.write(
join(root, file),
join(
components_paths[component_key],
relpath(
join(root, file),
component_path,
),
),
)
else:
zf.write(component_path, basename(component_path))
zf.writestr(
"components.json",
json.dumps(
{
"meta": {
"hw_target": self.target,
"flavor": self.flavor,
"version": self.args.version,
},
"components": components_paths,
}
),
)
def bundle_update_package(self):
self.logger.debug(
f"Generating update bundle with version {self.args.version} for {self.target}"
)
bundle_dir_name = f"{self.target}-update-{self.args.suffix}"[
: self.DIST_FOLDER_MAX_NAME_LENGTH
]
bundle_dir = self.get_dist_path(bundle_dir_name)
bundle_args = [
"generate",
"-d",
bundle_dir,
"-v",
self.args.version,
"-t",
self.target,
"--dfu",
self.get_dist_path(
self.get_project_file_name(self.projects["firmware"], "dfu")
),
"--stage",
self.get_dist_path(
self.get_project_file_name(self.projects["updater"], "bin")
),
]
if self.args.resources:
bundle_args.extend(
(
"-r",
self.args.resources,
)
)
bundle_args.extend(self.other_args)
if (bundle_result := UpdateMain(no_exit=True)(bundle_args)) == 0:
self.note_dist_component("update", "dir", bundle_dir)
self.logger.info(
fg.boldgreen(
f"Use this directory to self-update your Flipper:\n\t{bundle_dir}"
)
)
# Create tgz archive
with tarfile.open(
join(
self.output_dir_path,
bundle_tgz := f"{self.DIST_FILE_PREFIX}{bundle_dir_name}.tgz",
),
"w:gz",
compresslevel=9,
format=tarfile.USTAR_FORMAT,
) as tar:
self.note_dist_component(
"update", "tgz", self.get_dist_path(bundle_tgz)
)
tar.add(bundle_dir, arcname=bundle_dir_name)
return bundle_result
if __name__ == "__main__":
Main()()
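For reference, the dict that `bundle_sdk` serializes into `components.json` ends up shaped roughly like this; keys mirror `sdk_components_keys` above, while the target, flavor, version and file names are illustrative:

```python
# Illustrative shape of the components.json placed inside the SDK zip; values are
# basenames of the corresponding dist artifacts or directories, all made up here.
components_json = {
    "meta": {"hw_target": "f7", "flavor": "local", "version": "0.1.2"},
    "components": {
        "full.bin": "flipper-z-f7-full-local.bin",
        "firmware.elf": "flipper-z-f7-firmware-local.elf",
        "update.dir": "f7-update-local",
        "sdk_headers.dir": "sdk_headers",
        "lib.dir": "lib",
        "debug.dir": "debug",
        "scripts.dir": "scripts",
    },
}
```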

View File

@ -8,7 +8,7 @@ import sys
def main():
logger = logging.getLogger()
if not (port := resolve_port(logger, "auto")):
logger.error("Is Flipper connected over USB and isn't in DFU mode?")
logger.error("Is Flipper connected over USB and is it not in DFU mode?")
return 1
subprocess.call(
[

View File

@ -15,10 +15,12 @@ if not ["%FBT_NOENV%"] == [""] (
set "FLIPPER_TOOLCHAIN_VERSION=21"
if ["%FBT_TOOLCHAIN_ROOT%"] == [""] (
set "FBT_TOOLCHAIN_ROOT=%FBT_ROOT%\toolchain\x86_64-windows"
if ["%FBT_TOOLCHAIN_PATH%"] == [""] (
set "FBT_TOOLCHAIN_PATH=%FBT_ROOT%"
)
set "FBT_TOOLCHAIN_ROOT=%FBT_TOOLCHAIN_PATH%\toolchain\x86_64-windows"
set "FBT_TOOLCHAIN_VERSION_FILE=%FBT_TOOLCHAIN_ROOT%\VERSION"
if not exist "%FBT_TOOLCHAIN_ROOT%" (

View File

@ -4,9 +4,15 @@
# public variables
DEFAULT_SCRIPT_PATH="$(pwd -P)";
SCRIPT_PATH="${SCRIPT_PATH:-$DEFAULT_SCRIPT_PATH}";
FBT_TOOLCHAIN_VERSION="${FBT_TOOLCHAIN_VERSION:-"21"}";
FBT_TOOLCHAIN_PATH="${FBT_TOOLCHAIN_PATH:-$SCRIPT_PATH}";
if [ -z ${FBT_TOOLCHAIN_PATH+x} ] ; then
FBT_TOOLCHAIN_PATH_WAS_SET=0;
else
FBT_TOOLCHAIN_PATH_WAS_SET=1;
fi
FBT_TOOLCHAIN_PATH="${FBT_TOOLCHAIN_PATH:-$DEFAULT_SCRIPT_PATH}";
FBT_VERBOSE="${FBT_VERBOSE:-""}";
fbtenv_show_usage()
@ -60,7 +66,6 @@ fbtenv_restore_env()
unset SAVED_PYTHONPATH;
unset SAVED_PYTHONHOME;
unset SCRIPT_PATH;
unset FBT_TOOLCHAIN_VERSION;
unset FBT_TOOLCHAIN_PATH;
}
@ -104,13 +109,14 @@ fbtenv_set_shell_prompt()
return 0; # all other shells
}
fbtenv_check_script_path()
fbtenv_check_env_vars()
{
if [ ! -x "$SCRIPT_PATH/fbt" ] && [ ! -x "$SCRIPT_PATH/ufbt" ] ; then
echo "Please source this script from [u]fbt root directory, or specify 'SCRIPT_PATH' variable manually";
# Return error if FBT_TOOLCHAIN_PATH is not set before script is sourced or if fbt executable is not in DEFAULT_SCRIPT_PATH
if [ "$FBT_TOOLCHAIN_PATH_WAS_SET" -eq 0 ] && [ ! -x "$DEFAULT_SCRIPT_PATH/fbt" ] && [ ! -x "$DEFAULT_SCRIPT_PATH/ufbt" ] ; then
echo "Please source this script from [u]fbt root directory, or specify 'FBT_TOOLCHAIN_PATH' variable manually";
echo "Example:";
printf "\tSCRIPT_PATH=lang/c/flipperzero-firmware source lang/c/flipperzero-firmware/scripts/fbtenv.sh\n";
echo "If current directory is right, type 'unset SCRIPT_PATH' and try again"
printf "\tFBT_TOOLCHAIN_PATH=lang/c/flipperzero-firmware source lang/c/flipperzero-firmware/scripts/fbtenv.sh\n";
echo "If current directory is right, type 'unset FBT_TOOLCHAIN_PATH' and try again"
return 1;
fi
return 0;
@ -207,7 +213,7 @@ fbtenv_show_unpack_percentage()
fbtenv_unpack_toolchain()
{
echo "Unpacking toolchain:";
echo "Unpacking toolchain to '$FBT_TOOLCHAIN_PATH/toolchain':";
tar -xvf "$FBT_TOOLCHAIN_PATH/toolchain/$TOOLCHAIN_TAR" -C "$FBT_TOOLCHAIN_PATH/toolchain" 2>&1 | fbtenv_show_unpack_percentage;
mkdir -p "$FBT_TOOLCHAIN_PATH/toolchain" || return 1;
mv "$FBT_TOOLCHAIN_PATH/toolchain/$TOOLCHAIN_DIR" "$TOOLCHAIN_ARCH_DIR" || return 1;
@ -215,7 +221,7 @@ fbtenv_unpack_toolchain()
return 0;
}
fbtenv_clearing()
fbtenv_cleanup()
{
printf "Cleaning up..";
if [ -n "${FBT_TOOLCHAIN_PATH:-""}" ]; then
@ -270,14 +276,14 @@ fbtenv_download_toolchain()
fbtenv_check_tar || return 1;
TOOLCHAIN_TAR="$(basename "$TOOLCHAIN_URL")";
TOOLCHAIN_DIR="$(echo "$TOOLCHAIN_TAR" | sed "s/-$FBT_TOOLCHAIN_VERSION.tar.gz//g")";
trap fbtenv_clearing 2; # trap will be restored in fbtenv_clearing
trap fbtenv_cleanup 2; # trap will be restored in fbtenv_cleanup
if ! fbtenv_check_downloaded_toolchain; then
fbtenv_curl_wget_check || return 1;
fbtenv_download_toolchain_tar || return 1;
fi
fbtenv_remove_old_tooclhain;
fbtenv_unpack_toolchain || return 1;
fbtenv_clearing;
fbtenv_cleanup;
return 0;
}
@ -296,8 +302,8 @@ fbtenv_main()
fbtenv_restore_env;
return 0;
fi
fbtenv_check_if_sourced_multiple_times; # many source it's just a warning
fbtenv_check_script_path || return 1;
fbtenv_check_if_sourced_multiple_times;
fbtenv_check_env_vars || return 1;
fbtenv_check_download_toolchain || return 1;
fbtenv_set_shell_prompt;
fbtenv_print_version;

scripts/ufbt/SConstruct (new file)
View File

@ -0,0 +1,393 @@
from SCons.Platform import TempFileMunge
from SCons.Node import FS
from SCons.Errors import UserError
import os
import multiprocessing
import pathlib
SetOption("num_jobs", multiprocessing.cpu_count())
SetOption("max_drift", 1)
# SetOption("silent", False)
ufbt_state_dir = Dir(os.environ.get("UFBT_STATE_DIR", "#.ufbt"))
ufbt_script_dir = Dir(os.environ.get("UFBT_SCRIPT_DIR"))
ufbt_current_sdk_dir = ufbt_state_dir.Dir("current")
SConsignFile(ufbt_state_dir.File(".sconsign.dblite").abspath)
ufbt_variables = SConscript("commandline.scons")
forward_os_env = {
# Import PATH from OS env - scons doesn't do that by default
"PATH": os.environ["PATH"],
}
# Proxying environment to child processes & scripts
variables_to_forward = [
# CI/CD variables
"WORKFLOW_BRANCH_OR_TAG",
"DIST_SUFFIX",
# Python & other tools
"HOME",
"APPDATA",
"PYTHONHOME",
"PYTHONNOUSERSITE",
"TMP",
"TEMP",
# Colors for tools
"TERM",
]
if proxy_env := GetOption("proxy_env"):
variables_to_forward.extend(proxy_env.split(","))
for env_value_name in variables_to_forward:
if environ_value := os.environ.get(env_value_name, None):
forward_os_env[env_value_name] = environ_value
# Core environment init - loads SDK state, sets up paths, etc.
core_env = Environment(
variables=ufbt_variables,
ENV=forward_os_env,
UFBT_STATE_DIR=ufbt_state_dir,
UFBT_CURRENT_SDK_DIR=ufbt_current_sdk_dir,
UFBT_SCRIPT_DIR=ufbt_script_dir,
toolpath=[ufbt_current_sdk_dir.Dir("scripts/ufbt/site_tools")],
tools=[
"ufbt_state",
("ufbt_help", {"vars": ufbt_variables}),
],
)
if "update" in BUILD_TARGETS:
SConscript(
"update.scons",
exports={"core_env": core_env},
)
if "purge" in BUILD_TARGETS:
core_env.Execute(Delete(ufbt_state_dir))
print("uFBT state purged")
Exit(0)
# Now we can import stuff bundled with SDK - it was added to sys.path by ufbt_state
from fbt.util import (
tempfile_arg_esc_func,
single_quote,
extract_abs_dir,
extract_abs_dir_path,
wrap_tempfile,
path_as_posix,
)
from fbt.appmanifest import FlipperAppType
from fbt.sdk.cache import SdkCache
# Base environment with all tools loaded from SDK
env = core_env.Clone(
toolpath=[core_env["FBT_SCRIPT_DIR"].Dir("fbt_tools")],
tools=[
"fbt_tweaks",
(
"crosscc",
{
"toolchain_prefix": "arm-none-eabi-",
"versions": (" 10.3",),
},
),
"fwbin",
"python3",
"sconsrecursiveglob",
"sconsmodular",
"ccache",
"fbt_apps",
"fbt_extapps",
"fbt_assets",
("compilation_db", {"COMPILATIONDB_COMSTR": "\tCDB\t${TARGET}"}),
],
FBT_FAP_DEBUG_ELF_ROOT=ufbt_state_dir.Dir("build"),
TEMPFILE=TempFileMunge,
MAXLINELENGTH=2048,
PROGSUFFIX=".elf",
TEMPFILEARGESCFUNC=tempfile_arg_esc_func,
SINGLEQUOTEFUNC=single_quote,
ABSPATHGETTERFUNC=extract_abs_dir_path,
APPS=[],
UFBT_API_VERSION=SdkCache(
core_env.subst("$SDK_DEFINITION"), load_version_only=True
).version,
APPCHECK_COMSTR="\tAPPCHK\t${SOURCE}\n\t\tTarget: ${TARGET_HW}, API: ${UFBT_API_VERSION}",
)
wrap_tempfile(env, "LINKCOM")
wrap_tempfile(env, "ARCOM")
# print(env.Dump())
# Dist env
dist_env = env.Clone(
tools=[
"fbt_dist",
"fbt_debugopts",
"openocd",
"blackmagic",
"jflash",
"textfile",
],
ENV=os.environ,
OPENOCD_OPTS=[
"-f",
"interface/stlink.cfg",
"-c",
"transport select hla_swd",
"-f",
"${FBT_DEBUG_DIR}/stm32wbx.cfg",
"-c",
"stm32wbx.cpu configure -rtos auto",
],
)
openocd_target = dist_env.OpenOCDFlash(
dist_env["UFBT_STATE_DIR"].File("flash"),
dist_env["FW_BIN"],
OPENOCD_COMMAND=[
"-c",
"program ${SOURCE.posix} reset exit 0x08000000",
],
)
dist_env.Alias("firmware_flash", openocd_target)
dist_env.Alias("flash", openocd_target)
if env["FORCE"]:
env.AlwaysBuild(openocd_target)
firmware_debug = dist_env.PhonyTarget(
"debug",
"${GDBPYCOM}",
source=dist_env["FW_ELF"],
GDBOPTS="${GDBOPTS_BASE}",
GDBREMOTE="${OPENOCD_GDB_PIPE}",
FBT_FAP_DEBUG_ELF_ROOT=path_as_posix(dist_env.subst("$FBT_FAP_DEBUG_ELF_ROOT")),
)
dist_env.PhonyTarget(
"blackmagic",
"${GDBPYCOM}",
source=dist_env["FW_ELF"],
GDBOPTS="${GDBOPTS_BASE} ${GDBOPTS_BLACKMAGIC}",
GDBREMOTE="${BLACKMAGIC_ADDR}",
FBT_FAP_DEBUG_ELF_ROOT=path_as_posix(dist_env.subst("$FBT_FAP_DEBUG_ELF_ROOT")),
)
dist_env.PhonyTarget(
"flash_blackmagic",
"$GDB $GDBOPTS $SOURCES $GDBFLASH",
source=dist_env["FW_ELF"],
GDBOPTS="${GDBOPTS_BASE} ${GDBOPTS_BLACKMAGIC}",
GDBREMOTE="${BLACKMAGIC_ADDR}",
GDBFLASH=[
"-ex",
"load",
"-ex",
"quit",
],
)
flash_usb_full = dist_env.UsbInstall(
dist_env["UFBT_STATE_DIR"].File("usbinstall"),
[],
)
dist_env.AlwaysBuild(flash_usb_full)
dist_env.Alias("flash_usb", flash_usb_full)
dist_env.Alias("flash_usb_full", flash_usb_full)
# App build environment
appenv = env.Clone(
CCCOM=env["CCCOM"].replace("$CFLAGS", "$CFLAGS_APP $CFLAGS"),
CXXCOM=env["CXXCOM"].replace("$CXXFLAGS", "$CXXFLAGS_APP $CXXFLAGS"),
LINKCOM=env["LINKCOM"].replace("$LINKFLAGS", "$LINKFLAGS_APP $LINKFLAGS"),
COMPILATIONDB_USE_ABSPATH=True,
)
original_app_dir = Dir(appenv.subst("$UFBT_APP_DIR"))
app_mount_point = Dir("#/app/")
app_mount_point.addRepository(original_app_dir)
appenv.LoadAppManifest(app_mount_point)
appenv.PrepareApplicationsBuild()
#######################
apps_artifacts = appenv["EXT_APPS"]
apps_to_build_as_faps = [
FlipperAppType.PLUGIN,
FlipperAppType.EXTERNAL,
]
known_extapps = [
app
for apptype in apps_to_build_as_faps
for app in appenv["APPBUILD"].get_apps_of_type(apptype, True)
]
for app in known_extapps:
app_artifacts = appenv.BuildAppElf(app)
app_src_dir = extract_abs_dir(app_artifacts.app._appdir)
app_artifacts.installer = [
appenv.Install(app_src_dir.Dir("dist"), app_artifacts.compact),
appenv.Install(app_src_dir.Dir("dist").Dir("debug"), app_artifacts.debug),
]
if appenv["FORCE"]:
appenv.AlwaysBuild([extapp.compact for extapp in apps_artifacts.values()])
# Final steps - target aliases
install_and_check = [
(extapp.installer, extapp.validator) for extapp in apps_artifacts.values()
]
Alias(
"faps",
install_and_check,
)
Default(install_and_check)
# Compilation database
fwcdb = appenv.CompilationDatabase(
original_app_dir.Dir(".vscode").File("compile_commands.json")
)
AlwaysBuild(fwcdb)
Precious(fwcdb)
NoClean(fwcdb)
if len(apps_artifacts):
Default(fwcdb)
# launch handler
runnable_apps = appenv["APPBUILD"].get_apps_of_type(FlipperAppType.EXTERNAL, True)
app_to_launch = None
if len(runnable_apps) == 1:
app_to_launch = runnable_apps[0].appid
elif len(runnable_apps) > 1:
# more than 1 app - try to find one with matching id
app_to_launch = appenv.subst("$APPID")
def ambiguous_app_call(**kw):
raise UserError(
f"More than one app is runnable: {', '.join(app.appid for app in runnable_apps)}. Please specify an app with APPID=..."
)
if app_to_launch:
appenv.AddAppLaunchTarget(app_to_launch, "launch")
else:
dist_env.PhonyTarget("launch", Action(ambiguous_app_call, None))
# cli handler
appenv.PhonyTarget(
"cli",
'${PYTHON3} "${FBT_SCRIPT_DIR}/serial_cli.py"',
)
# Linter
dist_env.PhonyTarget(
"lint",
"${PYTHON3} ${FBT_SCRIPT_DIR}/lint.py check ${LINT_SOURCES}",
source=original_app_dir.File(".clang-format"),
LINT_SOURCES=[original_app_dir],
)
dist_env.PhonyTarget(
"format",
"${PYTHON3} ${FBT_SCRIPT_DIR}/lint.py format ${LINT_SOURCES}",
source=original_app_dir.File(".clang-format"),
LINT_SOURCES=[original_app_dir],
)
# Prepare vscode environment
def _path_as_posix(path):
return pathlib.Path(path).as_posix()
vscode_dist = []
project_template_dir = dist_env["UFBT_SCRIPT_ROOT"].Dir("project_template")
for template_file in project_template_dir.Dir(".vscode").glob("*"):
vscode_dist.append(
dist_env.Substfile(
original_app_dir.Dir(".vscode").File(template_file.name),
template_file,
SUBST_DICT={
"@UFBT_VSCODE_PATH_SEP@": os.path.pathsep,
"@UFBT_TOOLCHAIN_ARM_TOOLCHAIN_DIR@": pathlib.Path(
dist_env.WhereIs("arm-none-eabi-gcc")
).parent.as_posix(),
"@UFBT_TOOLCHAIN_GCC@": _path_as_posix(
dist_env.WhereIs("arm-none-eabi-gcc")
),
"@UFBT_TOOLCHAIN_GDB_PY@": _path_as_posix(
dist_env.WhereIs("arm-none-eabi-gdb-py")
),
"@UFBT_TOOLCHAIN_OPENOCD@": _path_as_posix(dist_env.WhereIs("openocd")),
"@UFBT_APP_DIR@": _path_as_posix(original_app_dir.abspath),
"@UFBT_ROOT_DIR@": _path_as_posix(Dir("#").abspath),
"@UFBT_DEBUG_DIR@": dist_env["FBT_DEBUG_DIR"],
"@UFBT_DEBUG_ELF_DIR@": _path_as_posix(
dist_env["FBT_FAP_DEBUG_ELF_ROOT"].abspath
),
"@UFBT_FIRMWARE_ELF@": _path_as_posix(dist_env["FW_ELF"].abspath),
},
)
)
for config_file in project_template_dir.glob(".*"):
if isinstance(config_file, FS.Dir):
continue
vscode_dist.append(dist_env.Install(original_app_dir, config_file))
dist_env.Precious(vscode_dist)
dist_env.NoClean(vscode_dist)
dist_env.Alias("vscode_dist", vscode_dist)
# Creating app from base template
dist_env.SetDefault(FBT_APPID=appenv.subst("$APPID") or "template")
app_template_dist = []
for template_file in project_template_dir.Dir("app_template").glob("*"):
dist_file_name = dist_env.subst(template_file.name)
if template_file.name.endswith(".png"):
app_template_dist.append(
dist_env.InstallAs(original_app_dir.File(dist_file_name), template_file)
)
else:
app_template_dist.append(
dist_env.Substfile(
original_app_dir.File(dist_file_name),
template_file,
SUBST_DICT={
"@FBT_APPID@": dist_env.subst("$FBT_APPID"),
},
)
)
AddPostAction(
app_template_dist[-1],
[
Mkdir(original_app_dir.Dir("images")),
Touch(original_app_dir.Dir("images").File(".gitkeep")),
],
)
dist_env.Precious(app_template_dist)
dist_env.NoClean(app_template_dist)
dist_env.Alias("create", app_template_dist)

View File

@ -0,0 +1,90 @@
AddOption(
"--proxy-env",
action="store",
dest="proxy_env",
default="",
help="Comma-separated list of additional environment variables to pass to child SCons processes",
)
AddOption(
"--channel",
action="store",
dest="sdk_channel",
choices=["dev", "rc", "release"],
default="",
help="Release channel to use for SDK",
)
AddOption(
"--branch",
action="store",
dest="sdk_branch",
help="Custom main repo branch to use for SDK",
)
AddOption(
"--hw-target",
action="store",
dest="sdk_target",
help="SDK Hardware target",
)
vars = Variables("ufbt_options.py", ARGUMENTS)
vars.AddVariables(
BoolVariable(
"VERBOSE",
help="Print full commands",
default=False,
),
BoolVariable(
"FORCE",
help="Force target action (for supported targets)",
default=False,
),
# These 2 are inherited from SDK
# BoolVariable(
# "DEBUG",
# help="Enable debug build",
# default=True,
# ),
# BoolVariable(
# "COMPACT",
# help="Optimize for size",
# default=False,
# ),
PathVariable(
"OTHER_ELF",
help="Path to prebuilt ELF file to debug",
validator=PathVariable.PathAccept,
default="",
),
(
"OPENOCD_OPTS",
"Options to pass to OpenOCD",
"",
),
(
"BLACKMAGIC",
"Blackmagic probe location",
"auto",
),
(
"OPENOCD_ADAPTER_SERIAL",
"OpenOCD adapter serial number",
"auto",
),
(
"APPID",
"Application id",
"",
),
PathVariable(
"UFBT_APP_DIR",
help="Application dir to work with",
validator=PathVariable.PathIsDir,
default="",
),
)
Return("vars")
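Because the variables are constructed with `Variables("ufbt_options.py", ARGUMENTS)`, they can also be persisted in an `ufbt_options.py` file (assumed to be resolved relative to the working directory). An illustrative sketch:

```python
# ufbt_options.py (illustrative): persists the variables above so they don't have
# to be passed as VAR=value on every invocation. All values here are examples.
VERBOSE = 0
FORCE = 0
APPID = "example_app"            # app to launch when more than one is runnable
OPENOCD_ADAPTER_SERIAL = "auto"
BLACKMAGIC = "auto"
```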

View File

@ -0,0 +1,191 @@
---
Language: Cpp
AccessModifierOffset: -4
AlignAfterOpenBracket: AlwaysBreak
AlignArrayOfStructures: None
AlignConsecutiveMacros: None
AlignConsecutiveAssignments: None
AlignConsecutiveBitFields: None
AlignConsecutiveDeclarations: None
AlignEscapedNewlines: Left
AlignOperands: Align
AlignTrailingComments: false
AllowAllArgumentsOnNextLine: true
AllowAllParametersOfDeclarationOnNextLine: false
AllowShortEnumsOnASingleLine: true
AllowShortBlocksOnASingleLine: Never
AllowShortCaseLabelsOnASingleLine: false
AllowShortFunctionsOnASingleLine: None
AllowShortLambdasOnASingleLine: All
AllowShortIfStatementsOnASingleLine: WithoutElse
AllowShortLoopsOnASingleLine: true
AlwaysBreakAfterDefinitionReturnType: None
AlwaysBreakAfterReturnType: None
AlwaysBreakBeforeMultilineStrings: false
AlwaysBreakTemplateDeclarations: Yes
AttributeMacros:
- __capability
BinPackArguments: false
BinPackParameters: false
BraceWrapping:
AfterCaseLabel: false
AfterClass: false
AfterControlStatement: Never
AfterEnum: false
AfterFunction: false
AfterNamespace: false
AfterObjCDeclaration: false
AfterStruct: false
AfterUnion: false
AfterExternBlock: false
BeforeCatch: false
BeforeElse: false
BeforeLambdaBody: false
BeforeWhile: false
IndentBraces: false
SplitEmptyFunction: true
SplitEmptyRecord: true
SplitEmptyNamespace: true
BreakBeforeBinaryOperators: None
BreakBeforeConceptDeclarations: true
BreakBeforeBraces: Attach
BreakBeforeInheritanceComma: false
BreakInheritanceList: BeforeColon
BreakBeforeTernaryOperators: false
BreakConstructorInitializersBeforeComma: false
BreakConstructorInitializers: BeforeComma
BreakAfterJavaFieldAnnotations: false
BreakStringLiterals: false
ColumnLimit: 99
CommentPragmas: '^ IWYU pragma:'
QualifierAlignment: Leave
CompactNamespaces: false
ConstructorInitializerIndentWidth: 4
ContinuationIndentWidth: 4
Cpp11BracedListStyle: true
DeriveLineEnding: true
DerivePointerAlignment: false
DisableFormat: false
EmptyLineAfterAccessModifier: Never
EmptyLineBeforeAccessModifier: LogicalBlock
ExperimentalAutoDetectBinPacking: false
PackConstructorInitializers: BinPack
BasedOnStyle: ''
ConstructorInitializerAllOnOneLineOrOnePerLine: false
AllowAllConstructorInitializersOnNextLine: true
FixNamespaceComments: false
ForEachMacros:
- foreach
- Q_FOREACH
- BOOST_FOREACH
IfMacros:
- KJ_IF_MAYBE
IncludeBlocks: Preserve
IncludeCategories:
- Regex: '.*'
Priority: 1
SortPriority: 0
CaseSensitive: false
- Regex: '^(<|"(gtest|gmock|isl|json)/)'
Priority: 3
SortPriority: 0
CaseSensitive: false
- Regex: '.*'
Priority: 1
SortPriority: 0
CaseSensitive: false
IncludeIsMainRegex: '(Test)?$'
IncludeIsMainSourceRegex: ''
IndentAccessModifiers: false
IndentCaseLabels: false
IndentCaseBlocks: false
IndentGotoLabels: true
IndentPPDirectives: None
IndentExternBlock: AfterExternBlock
IndentRequires: false
IndentWidth: 4
IndentWrappedFunctionNames: true
InsertTrailingCommas: None
JavaScriptQuotes: Leave
JavaScriptWrapImports: true
KeepEmptyLinesAtTheStartOfBlocks: false
LambdaBodyIndentation: Signature
MacroBlockBegin: ''
MacroBlockEnd: ''
MaxEmptyLinesToKeep: 1
NamespaceIndentation: None
ObjCBinPackProtocolList: Auto
ObjCBlockIndentWidth: 4
ObjCBreakBeforeNestedBlockParam: true
ObjCSpaceAfterProperty: true
ObjCSpaceBeforeProtocolList: true
PenaltyBreakAssignment: 10
PenaltyBreakBeforeFirstCallParameter: 30
PenaltyBreakComment: 10
PenaltyBreakFirstLessLess: 0
PenaltyBreakOpenParenthesis: 0
PenaltyBreakString: 10
PenaltyBreakTemplateDeclaration: 10
PenaltyExcessCharacter: 100
PenaltyReturnTypeOnItsOwnLine: 60
PenaltyIndentedWhitespace: 0
PointerAlignment: Left
PPIndentWidth: -1
ReferenceAlignment: Pointer
ReflowComments: false
RemoveBracesLLVM: false
SeparateDefinitionBlocks: Leave
ShortNamespaceLines: 1
SortIncludes: Never
SortJavaStaticImport: Before
SortUsingDeclarations: false
SpaceAfterCStyleCast: false
SpaceAfterLogicalNot: false
SpaceAfterTemplateKeyword: true
SpaceBeforeAssignmentOperators: true
SpaceBeforeCaseColon: false
SpaceBeforeCpp11BracedList: false
SpaceBeforeCtorInitializerColon: true
SpaceBeforeInheritanceColon: true
SpaceBeforeParens: Never
SpaceBeforeParensOptions:
AfterControlStatements: false
AfterForeachMacros: false
AfterFunctionDefinitionName: false
AfterFunctionDeclarationName: false
AfterIfMacros: false
AfterOverloadedOperator: false
BeforeNonEmptyParentheses: false
SpaceAroundPointerQualifiers: Default
SpaceBeforeRangeBasedForLoopColon: true
SpaceInEmptyBlock: false
SpaceInEmptyParentheses: false
SpacesBeforeTrailingComments: 1
SpacesInAngles: Never
SpacesInConditionalStatement: false
SpacesInContainerLiterals: false
SpacesInCStyleCastParentheses: false
SpacesInLineCommentPrefix:
Minimum: 1
Maximum: -1
SpacesInParentheses: false
SpacesInSquareBrackets: false
SpaceBeforeSquareBrackets: false
BitFieldColonSpacing: Both
Standard: c++03
StatementAttributeLikeMacros:
- Q_EMIT
StatementMacros:
- Q_UNUSED
- QT_REQUIRE_VERSION
TabWidth: 4
UseCRLF: false
UseTab: Never
WhitespaceSensitiveMacros:
- STRINGIZE
- PP_STRINGIZE
- BOOST_PP_STRINGIZE
- NS_SWIFT_NAME
- CF_SWIFT_NAME
...

View File

@ -0,0 +1,13 @@
root = true
[*]
end_of_line = lf
insert_final_newline = true
charset = utf-8
[*.{cpp,h,c,py,sh}]
indent_style = space
indent_size = 4
[{Makefile,*.mk}]
indent_size = tab

View File

@ -0,0 +1,4 @@
dist/*
.vscode
.clang-format
.editorconfig

View File

@ -0,0 +1,14 @@
{
"configurations": [
{
"name": "main",
"compilerPath": "@UFBT_TOOLCHAIN_GCC@",
"intelliSenseMode": "gcc-arm",
"compileCommands": "${workspaceFolder}/.vscode/compile_commands.json",
"configurationProvider": "ms-vscode.cpptools",
"cStandard": "gnu17",
"cppStandard": "c++17"
},
],
"version": 4
}

View File

@ -0,0 +1,18 @@
{
// See https://go.microsoft.com/fwlink/?LinkId=827846 to learn about workspace recommendations.
// Extension identifier format: ${publisher}.${name}. Example: vscode.csharp
// List of extensions which should be recommended for users of this workspace.
"recommendations": [
"ms-python.black-formatter",
"ms-vscode.cpptools",
"amiralizadeh9480.cpp-helper",
"marus25.cortex-debug",
"zxh404.vscode-proto3",
"augustocdias.tasks-shell-input"
],
// List of extensions recommended by VS Code that should not be recommended for users of this workspace.
"unwantedRecommendations": [
"twxs.cmake",
"ms-vscode.cmake-tools"
]
}

View File

@ -0,0 +1,98 @@
{
// For more information, visit: https://go.microsoft.com/fwlink/?linkid=830387
"version": "0.2.0",
"inputs": [
// {
// "id": "BLACKMAGIC",
// "type": "command",
// "command": "shellCommand.execute",
// "args": {
// "useSingleResult": true,
// "env": {
// "PATH": "${workspaceFolder};${env:PATH}"
// },
// "command": "./fbt get_blackmagic",
// "description": "Get Blackmagic device",
// }
// },
],
"configurations": [
{
"name": "Attach FW (ST-Link)",
"cwd": "${workspaceFolder}",
"executable": "@UFBT_FIRMWARE_ELF@",
"request": "attach",
"type": "cortex-debug",
"servertype": "openocd",
"device": "stlink",
"svdFile": "@UFBT_DEBUG_DIR@/STM32WB55_CM4.svd",
"rtos": "FreeRTOS",
"configFiles": [
"interface/stlink.cfg",
"@UFBT_DEBUG_DIR@/stm32wbx.cfg"
],
"postAttachCommands": [
"source @UFBT_DEBUG_DIR@/flipperapps.py",
"fap-set-debug-elf-root @UFBT_DEBUG_ELF_DIR@"
],
// "showDevDebugOutput": "raw",
},
{
"name": "Attach FW (DAP)",
"cwd": "${workspaceFolder}",
"executable": "@UFBT_FIRMWARE_ELF@",
"request": "attach",
"type": "cortex-debug",
"servertype": "openocd",
"device": "cmsis-dap",
"svdFile": "@UFBT_DEBUG_DIR@/STM32WB55_CM4.svd",
"rtos": "FreeRTOS",
"configFiles": [
"interface/cmsis-dap.cfg",
"@UFBT_DEBUG_DIR@/stm32wbx.cfg"
],
"postAttachCommands": [
"source @UFBT_DEBUG_DIR@/flipperapps.py",
"fap-set-debug-elf-root @UFBT_DEBUG_ELF_DIR@"
],
// "showDevDebugOutput": "raw",
},
// {
// "name": "Attach FW (blackmagic)",
// "cwd": "${workspaceFolder}",
// "executable": "@UFBT_FIRMWARE_ELF@",
// "request": "attach",
// "type": "cortex-debug",
// "servertype": "external",
// "gdbTarget": "${input:BLACKMAGIC}",
// "svdFile": "@UFBT_DEBUG_DIR@/STM32WB55_CM4.svd",
// "rtos": "FreeRTOS",
// "postAttachCommands": [
// "monitor swdp_scan",
// "attach 1",
// "set confirm off",
// "set mem inaccessible-by-default off",
// "source @UFBT_DEBUG_DIR@/flipperapps.py",
// "fap-set-debug-elf-root @UFBT_DEBUG_ELF_DIR@"
// ]
// // "showDevDebugOutput": "raw",
// },
{
"name": "Attach FW (JLink)",
"cwd": "${workspaceFolder}",
"executable": "@UFBT_FIRMWARE_ELF@",
"request": "attach",
"type": "cortex-debug",
"servertype": "jlink",
"interface": "swd",
"device": "STM32WB55RG",
"svdFile": "@UFBT_DEBUG_DIR@/STM32WB55_CM4.svd",
"rtos": "FreeRTOS",
"postAttachCommands": [
"source @UFBT_DEBUG_DIR@/flipperapps.py",
"fap-set-debug-elf-root @UFBT_DEBUG_ELF_DIR@"
]
// "showDevDebugOutput": "raw",
},
]
}

View File

@ -0,0 +1,20 @@
{
"cortex-debug.enableTelemetry": false,
"cortex-debug.variableUseNaturalFormat": false,
"cortex-debug.showRTOS": true,
"cortex-debug.armToolchainPath": "@UFBT_TOOLCHAIN_ARM_TOOLCHAIN_DIR@",
"cortex-debug.openocdPath": "@UFBT_TOOLCHAIN_OPENOCD@",
"cortex-debug.gdbPath": "@UFBT_TOOLCHAIN_GDB_PY@",
"editor.formatOnSave": true,
"files.associations": {
"*.scons": "python",
"SConscript": "python",
"SConstruct": "python",
"*.fam": "python"
},
"cortex-debug.registerUseNaturalFormat": false,
"python.analysis.typeCheckingMode": "off",
"[python]": {
"editor.defaultFormatter": "ms-python.black-formatter"
}
}

View File

@ -0,0 +1,54 @@
{
// See https://go.microsoft.com/fwlink/?LinkId=733558
// for the documentation about the tasks.json format
"version": "2.0.0",
"options": {
"env": {
"PATH": "${workspaceFolder}@UFBT_VSCODE_PATH_SEP@${env:PATH}@UFBT_VSCODE_PATH_SEP@@UFBT_ROOT_DIR@"
}
},
"tasks": [
{
"label": "Launch App on Flipper",
"group": "build",
"type": "shell",
"command": "ufbt launch"
},
{
"label": "Build",
"group": "build",
"type": "shell",
"command": "ufbt"
},
{
"label": "Flash FW (ST-Link)",
"group": "build",
"type": "shell",
"command": "ufbt FORCE=1 flash"
},
// {
// "label": "[NOTIMPL] Flash FW (blackmagic)",
// "group": "build",
// "type": "shell",
// "command": "ufbt flash_blackmagic"
// },
// {
// "label": "[NOTIMPL] Flash FW (JLink)",
// "group": "build",
// "type": "shell",
// "command": "ufbt FORCE=1 jflash"
// },
{
"label": "Flash FW (USB, with resources)",
"group": "build",
"type": "shell",
"command": "ufbt FORCE=1 flash_usb"
},
{
"label": "Update uFBT SDK",
"group": "build",
"type": "shell",
"command": "ufbt update"
}
]
}

View File

@ -0,0 +1,12 @@
#include <furi.h>
/* generated by fbt from .png files in images folder */
#include <@FBT_APPID@_icons.h>
int32_t @FBT_APPID@_app(void* p) {
UNUSED(p);
FURI_LOG_I("TEST", "Hello world");
FURI_LOG_I("TEST", "I'm @FBT_APPID@!");
return 0;
}

Binary file not shown.


View File

@ -0,0 +1,17 @@
# For details & more options, see documentation/AppManifests.md in firmware repo
App(
appid="@FBT_APPID@", # Must be unique
name="App @FBT_APPID@", # Displayed in menus
apptype=FlipperAppType.EXTERNAL,
entry_point="@FBT_APPID@_app",
stack_size=2 * 1024,
fap_category="Misc",
# Optional values
# fap_version=(0, 1), # (major, minor)
fap_icon="@FBT_APPID@.png", # 10x10 1-bit PNG
# fap_description="A simple app",
# fap_author="J. Doe",
# fap_weburl="https://github.com/user/@FBT_APPID@",
fap_icon_assets="images", # Image assets to compile for this application
)

scripts/ufbt/site_init.py (new file)
View File

@ -0,0 +1,36 @@
from SCons.Script import GetBuildFailures
import SCons.Errors
import atexit
from ansi.color import fg, fx
def bf_to_str(bf):
"""Convert an element of GetBuildFailures() to a string
in a useful way."""
if bf is None: # unknown targets produce None in the list
return "(unknown tgt)"
elif isinstance(bf, SCons.Errors.StopError):
return fg.yellow(str(bf))
elif bf.node:
return fg.yellow(str(bf.node)) + ": " + bf.errstr
elif bf.filename:
return fg.yellow(bf.filename) + ": " + bf.errstr
return fg.yellow("unknown failure: ") + bf.errstr
def display_build_status():
"""Display the build status. Called by atexit.
Here you could do all kinds of complicated things."""
bf = GetBuildFailures()
if bf:
# bf is normally a list of build failures; if an element is None,
# it's because of a target that scons doesn't know anything about.
failures_message = "\n".join([bf_to_str(x) for x in bf if x is not None])
print()
print(fg.brightred(fx.bold("*" * 10 + " FBT ERRORS " + "*" * 10)))
print(failures_message)
atexit.register(display_build_status)

View File

@ -0,0 +1,53 @@
targets_help = """Configuration variables:
"""
tail_help = """
TASKS:
(* - not supported yet)
launch:
Upload and start application over USB
vscode_dist:
Configure application in current directory for development in VSCode.
create:
Copy application template to current directory. Set APPID=myapp to create an app with id 'myapp'.
Building:
faps:
Build all FAP apps
fap_{APPID}, launch APPSRC={APPID}:
Build FAP app with appid={APPID}; upload & start it over USB
Flashing & debugging:
flash, flash_blackmagic, *jflash:
Flash firmware to target using debug probe
flash_usb, flash_usb_full:
Install firmware using self-update package
debug, debug_other, blackmagic:
Start GDB
Other:
cli:
Open a Flipper CLI session over USB
lint:
run linter for C code
format:
reformat C code
How to create a new application:
1. Create a new directory for your application and cd into it.
2. Run `ufbt vscode_dist create APPID=myapp`
3. In VSCode, open the folder and start editing.
4. Run `ufbt launch` to build and upload your application.
"""
def generate(env, **kw):
vars = kw["vars"]
basic_help = vars.GenerateHelpText(env)
env.Help(targets_help + basic_help + tail_help)
def exists(env):
return True

View File

@ -0,0 +1,117 @@
from SCons.Errors import StopError
from SCons.Warnings import warn, WarningOnByDefault
import json
import os
import sys
import pathlib
from functools import reduce
def _load_sdk_data(sdk_root):
split_vars = {
"cc_args",
"cpp_args",
"linker_args",
"linker_libs",
}
subst_vars = split_vars | {
"sdk_symbols",
}
sdk_data = {}
with open(os.path.join(sdk_root, "sdk.opts")) as f:
sdk_json_data = json.load(f)
replacements = {
sdk_json_data["app_ep_subst"]: "${APP_ENTRY}",
sdk_json_data["sdk_path_subst"]: sdk_root.replace("\\", "/"),
sdk_json_data["map_file_subst"]: "${TARGET}",
}
def do_value_substs(src_value):
if isinstance(src_value, str):
return reduce(
lambda acc, kv: acc.replace(*kv), replacements.items(), src_value
)
elif isinstance(src_value, list):
return [do_value_substs(v) for v in src_value]
else:
return src_value
for key, value in sdk_json_data.items():
if key in split_vars:
value = value.split()
if key in subst_vars:
value = do_value_substs(value)
sdk_data[key] = value
return sdk_data
def _load_state_file(state_dir_node, filename: str) -> dict:
state_path = os.path.join(state_dir_node.abspath, filename)
if not os.path.exists(state_path):
raise StopError(f"State file {state_path} not found")
with open(state_path, "r") as f:
return json.load(f)
def generate(env, **kw):
sdk_current_sdk_dir_node = env["UFBT_CURRENT_SDK_DIR"]
sdk_components_filename = kw.get("SDK_COMPONENTS", "components.json")
ufbt_state_filename = kw.get("UFBT_STATE", "ufbt_state.json")
sdk_state = _load_state_file(sdk_current_sdk_dir_node, sdk_components_filename)
ufbt_state = _load_state_file(sdk_current_sdk_dir_node, ufbt_state_filename)
if not (sdk_components := sdk_state.get("components", {})):
raise StopError("SDK state file doesn't contain components data")
sdk_data = _load_sdk_data(
sdk_current_sdk_dir_node.Dir(sdk_components["sdk_headers.dir"]).abspath
)
if not sdk_state["meta"]["hw_target"].endswith(sdk_data["hardware"]):
raise StopError("SDK state file doesn't match hardware target")
if sdk_state["meta"]["version"] != ufbt_state["version"]:
warn(
WarningOnByDefault,
f"Version mismatch: SDK state vs uFBT: {sdk_state['meta']['version']} vs {ufbt_state['version']}",
)
scripts_dir = sdk_current_sdk_dir_node.Dir(sdk_components["scripts.dir"])
env.SetDefault(
# Paths
SDK_DEFINITION=env.File(sdk_data["sdk_symbols"]),
FBT_DEBUG_DIR=pathlib.Path(
sdk_current_sdk_dir_node.Dir(sdk_components["debug.dir"]).abspath
).as_posix(),
FBT_SCRIPT_DIR=scripts_dir,
LIBPATH=sdk_current_sdk_dir_node.Dir(sdk_components["lib.dir"]),
FW_ELF=sdk_current_sdk_dir_node.File(sdk_components["firmware.elf"]),
FW_BIN=sdk_current_sdk_dir_node.File(sdk_components["full.bin"]),
UPDATE_BUNDLE_DIR=sdk_current_sdk_dir_node.Dir(sdk_components["update.dir"]),
SVD_FILE="${FBT_DEBUG_DIR}/STM32WB55_CM4.svd",
# Build variables
ROOT_DIR=env.Dir("#"),
FIRMWARE_BUILD_CFG="firmware",
TARGET_HW=int(sdk_data["hardware"]),
CFLAGS_APP=sdk_data["cc_args"],
CXXFLAGS_APP=sdk_data["cpp_args"],
LINKFLAGS_APP=sdk_data["linker_args"],
LIBS=sdk_data["linker_libs"],
# ufbt state
# UFBT_STATE_DIR=ufbt_state_dir_node,
# UFBT_CURRENT_SDK_DIR=sdk_current_sdk_dir_node,
UFBT_STATE=ufbt_state,
UFBT_BOOTSTRAP_SCRIPT="${UFBT_SCRIPT_DIR}/bootstrap.py",
UFBT_SCRIPT_ROOT=scripts_dir.Dir("ufbt"),
)
sys.path.insert(0, env["FBT_SCRIPT_DIR"].abspath)
def exists(env):
return True
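A small worked example of the marker substitution `_load_sdk_data` performs via `do_value_substs`; the marker names match the `SDK_ROOT_DIR`/`SDK_APP_EP_SUBST` placeholders used by the SDK tree builder, while the flags and path below are invented:

```python
from functools import reduce

# Invented sdk.opts-style value containing the two substitution markers
replacements = {
    "SDK_ROOT_DIR": "/home/user/.ufbt/current/sdk_headers",
    "SDK_APP_EP_SUBST": "${APP_ENTRY}",
}
src_value = "-Wl,--defsym=APP_EP=SDK_APP_EP_SUBST -ISDK_ROOT_DIR/f7_sdk"
result = reduce(lambda acc, kv: acc.replace(*kv), replacements.items(), src_value)
# -> "-Wl,--defsym=APP_EP=${APP_ENTRY} -I/home/user/.ufbt/current/sdk_headers/f7_sdk"
```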

scripts/ufbt/update.scons (new file)
View File

@ -0,0 +1,37 @@
from SCons.Errors import StopError
Import("core_env")
update_env = core_env.Clone(
toolpath=[core_env["FBT_SCRIPT_DIR"].Dir("fbt_tools")],
tools=["python3"],
)
print("Updating SDK...")
ufbt_state = update_env["UFBT_STATE"]
update_args = [
"--ufbt-dir",
f'"{update_env["UFBT_STATE_DIR"]}"',
]
if branch_name := GetOption("sdk_branch"):
update_args.extend(["--branch", branch_name])
elif channel_name := GetOption("sdk_channel"):
update_args.extend(["--channel", channel_name])
elif branch_name := ufbt_state.get("branch", None):
update_args.extend(["--branch", branch_name])
elif channel_name := ufbt_state.get("channel", None):
update_args.extend(["--channel", channel_name])
else:
raise StopError("No branch or channel specified for SDK update")
if hw_target := GetOption("sdk_target"):
update_args.extend(["--hw-target", hw_target])
else:
update_args.extend(["--hw-target", ufbt_state["hw_target"]])
update_env.Replace(UPDATE_ARGS=update_args)
result = update_env.Execute(
update_env.subst('$PYTHON3 "$UFBT_BOOTSTRAP_SCRIPT" $UPDATE_ARGS'),
)
Exit(result)
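For context, `ufbt_state.json` (loaded by the `ufbt_state` tool and used above for the update defaults) appears to contain at least the fields read in this diff. An illustrative sketch with made-up values:

```python
# Illustrative ufbt_state.json contents, inferred from the fields read here:
# "version" (compared against the SDK state), "branch"/"channel" (update defaults)
# and "hw_target" (passed to the bootstrap script).
ufbt_state = {
    "version": "0.1.2",
    "channel": "release",  # or e.g. "branch": "dev"
    "hw_target": "f7",
}
```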

View File

@ -112,86 +112,47 @@ Alias(
extapps.resources_dist = appenv.FapDist(appenv["RESOURCES_ROOT"], [])
if appsrc := appenv.subst("$APPSRC"):
deploy_sources, flipp_dist_paths, validators = [], [], []
run_script_extra_ars = ""
appenv.AddAppLaunchTarget(appsrc, "launch_app")
def _add_dist_targets(app_artifacts):
validators.append(app_artifacts.validator)
for _, ext_path in app_artifacts.dist_entries:
deploy_sources.append(app_artifacts.compact)
flipp_dist_paths.append(f"/ext/{ext_path}")
return app_artifacts
def _add_host_app_to_targets(host_app):
artifacts_app_to_run = appenv["EXT_APPS"].get(host_app.appid, None)
_add_dist_targets(artifacts_app_to_run)
for plugin in host_app._plugins:
_add_dist_targets(appenv["EXT_APPS"].get(plugin.appid, None))
artifacts_app_to_run = appenv.GetExtAppByIdOrPath(appsrc)
if artifacts_app_to_run.app.apptype == FlipperAppType.PLUGIN:
# We deploy host app instead
host_app = appenv["APPMGR"].get(artifacts_app_to_run.app.requires[0])
if host_app:
if host_app.apptype == FlipperAppType.EXTERNAL:
_add_host_app_to_targets(host_app)
else:
# host app is a built-in app
run_script_extra_ars = f"-a {host_app.name}"
_add_dist_targets(artifacts_app_to_run)
else:
raise UserError("Host app is unknown")
else:
_add_host_app_to_targets(artifacts_app_to_run.app)
# print(deploy_sources, flipp_dist_paths)
appenv.PhonyTarget(
"launch_app",
'${PYTHON3} "${APP_RUN_SCRIPT}" ${EXTRA_ARGS} -s ${SOURCES} -t ${FLIPPER_FILE_TARGETS}',
source=deploy_sources,
FLIPPER_FILE_TARGETS=flipp_dist_paths,
EXTRA_ARGS=run_script_extra_ars,
)
appenv.Alias("launch_app", validators)
# SDK management
sdk_origin_path = "${BUILD_DIR}/sdk_origin"
sdk_source = appenv.SDKPrebuilder(
sdk_origin_path,
amalgamated_api = "${BUILD_DIR}/sdk_origin"
sdk_source = appenv.ApiAmalgamator(
amalgamated_api,
# Deps on root SDK headers and generated files
(appenv["SDK_HEADERS"], appenv["FW_ASSETS_HEADERS"]),
)
# Extra deps on headers included in deeper levels
# Available on second and subsequent builds
Depends(sdk_source, appenv.ProcessSdkDepends(f"{sdk_origin_path}.d"))
Depends(sdk_source, appenv.ProcessSdkDepends(f"{amalgamated_api}.d"))
appenv["SDK_DIR"] = appenv.Dir("${BUILD_DIR}/sdk")
sdk_tree = appenv.SDKTree(appenv["SDK_DIR"], sdk_origin_path)
appenv["SDK_DIR"] = appenv.Dir("${BUILD_DIR}/sdk_headers")
sdk_header_tree = appenv.SDKHeaderTreeExtractor(appenv["SDK_DIR"], amalgamated_api)
# AlwaysBuild(sdk_tree)
Alias("sdk_tree", sdk_tree)
extapps.sdk_tree = sdk_tree
Alias("sdk_tree", sdk_header_tree)
extapps.sdk_tree = sdk_header_tree
sdk_apicheck = appenv.SDKSymUpdater(appenv["SDK_DEFINITION"], sdk_origin_path)
Precious(sdk_apicheck)
NoClean(sdk_apicheck)
AlwaysBuild(sdk_apicheck)
Alias("sdk_check", sdk_apicheck)
api_check = appenv.ApiTableValidator(appenv["SDK_DEFINITION"], amalgamated_api)
Precious(api_check)
NoClean(api_check)
AlwaysBuild(api_check)
Alias("api_check", api_check)
sdk_apisyms = appenv.SDKSymGenerator(
"${BUILD_DIR}/assets/compiled/symbols.h", appenv["SDK_DEFINITION"]
firmware_apitable = appenv.ApiSymbolTable(
"${BUILD_DIR}/assets/compiled/firmware_api_table.h", appenv["SDK_DEFINITION"]
)
Alias("api_syms", sdk_apisyms)
Alias("api_table", firmware_apitable)
ENV.Replace(
SDK_APISYMS=sdk_apisyms,
FW_API_TABLE=firmware_apitable,
_APP_ICONS=appenv["_APP_ICONS"],
)
if appenv["FORCE"]:
appenv.AlwaysBuild(sdk_source, sdk_tree, sdk_apicheck, sdk_apisyms)
appenv.AlwaysBuild(sdk_source, sdk_header_tree, api_check, firmware_apitable)
Return("extapps")