[1/1] utils/make_rich_wrapper.py: add new python script to wrap the build process into a colored TUI with progress tracking

Message ID 1166577682.270413.1673277779708@fidget.co-bxl
State Rejected
Series [1/1] utils/make_rich_wrapper.py: add new python script to wrap the build process into a colored TUI with progress tracking

Commit Message

Raphaël Slagmolen Jan. 9, 2023, 3:22 p.m. UTC
As the build process can be long and hard to track, I wrote a little Python
script that tries to make it easier for users to see where they are in the
process and what is going on, with log filtering and more.

The script is completely autonomous and doesn't impose anything on the
current build process, so it should work without affecting the
maintainability of the project's Makefile.

The script has only two dependencies:
    - packaging (for version parsing while checking Buildroot's dependencies)
    - rich (for the colored output and layout)

Features:
    - test the presence of Buildroot's dependencies, with version checking
and path validation (for `file`); a minimal sketch of this check is shown
after this list
    - (optional) test the presence of Buildroot's optional dependencies,
without blocking the process
    - can run only the checks, or skip them entirely, using command-line
options
    - filter the output of the build process while trying to detect the
fatal/error/warning/debug log level (may have some false positives)
    - can change the filtering level using a command-line option (default:
ERROR)
    - keep track of the amount of work to do using `make external-deps` and
display the overall progress
    - in case the amount of work isn't correctly detected (I try to account
for some special steps), the progress is fixed at the end with a message
    - keep track of all the tasks already done and the one currently being
worked on
    - display the current task's step message (extracted from the `>>> %s`
output)
    - keep track of the current task's progress based on the number of steps
done
    - display a progress bar for `wget` downloads, with the filename, size
and downloaded amount extracted from the output
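
To give an idea of how the dependency check works, here is a trimmed-down
sketch of the idea (simplified names; the real `which()` helper in the
script also supports a per-tool version-extraction tuple and a required
path):

    import shutil
    import subprocess
    from packaging import version

    def check_tool(exec_name, minimum=None):
        """Return True if exec_name is in the PATH and recent enough."""
        path = shutil.which(exec_name)
        if path is None:
            return False
        if minimum is not None:
            # Assumes the version is the last word of the first line of
            # `--version` output, e.g. "GNU Make 4.3" -> "4.3"
            first_line = subprocess.run([path, "--version"], text=True,
                                        capture_output=True,
                                        check=True).stdout.splitlines()[0]
            return version.parse(first_line.split()[-1]) >= version.parse(minimum)
        return True

    print(check_tool("make", "3.81"))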

The script launches `make` by itself, but the user is expected to have
already done the needed configuration (e.g. `make menuconfig`)!
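
The core loop is, in essence, the following (a simplified sketch of what the
script's main() does; the real loop uses select(), Rich progress bars and
more detection rules):

    import os
    import re
    import subprocess

    def run_make():
        env = dict(os.environ, LANG="C")  # keep make's messages parseable
        proc = subprocess.Popen(["make"], env=env, stdout=subprocess.PIPE,
                                stderr=subprocess.STDOUT, text=True)
        for line in proc.stdout:
            if ">>>" in line:
                # Buildroot prints ">>> <pkg> <version> <step>" for each step
                print(line.rstrip())
            elif re.search(r"\b(ERROR|WARNING)\b", line.upper()):
                print(line.rstrip())
            # anything else is filtered out (or logged at a lower level)
        return proc.wait()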

May be added later:
    - export the raw output from `make` into a log file in /tmp or elsewhere
(command-line option?)
    - add other trackable steps, like the `wget` one?
    - improve the log-level detection
    - any request from the team?!

Some images can be seen on my personal server:
    - the `--help` menu: https://naheulcraft.be/nextcloud/s/aX8fW5mK8dbXwqT
    - the progress tracking:
https://naheulcraft.be/nextcloud/s/87GK7yXQzZHydcj
    - the finished process: https://naheulcraft.be/nextcloud/s/ZPMt46WP5jib7mk

Signed-off-by: Raphaël Slagmolen (Tutul) <raphael.slagmolen@mailfence.com>
---
 DEVELOPERS                 |   3 +
 utils/make_rich_wrapper.py | 660 +++++++++++++++++++++++++++++++++++++
 2 files changed, 663 insertions(+)
 create mode 100755 utils/make_rich_wrapper.py

Comments

Yann E. MORIN Aug. 12, 2023, 8:32 p.m. UTC | #1
Raphaël, Al,

On 2023-01-09 16:22 +0100, Raphaël Slagmolen via buildroot spake thusly:
> As the build process can be long and hard to track, I wrote a little Python
> script that try to make it easier for users
> to see where in the process they are, what is going on, log filtering and
> more.

Thanks for this proposal, and sorry for the late reply...

We've discussed this a bit with another maintainer, and we are a bit
reluctant to carry this script in Buildroot.

First, we can't easily review it, because it is badly line-wrapped.

Second, it looks too complicated.

Third, we already have a simpler script that just filters the output of
make, and just displays the lines with '>>>', redirecting the full log
to a file: utils/brmake; this should be enough for a beautification of
the build output.
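
In essence, that is just the following (sketched here in Python for
illustration; brmake itself is a small shell script):

    # Not brmake itself, just the same idea: show only the '>>>' lines
    # and keep the complete output in a log file.
    import subprocess

    def filtered_make(logfile="br.log"):
        with open(logfile, "w") as log:
            proc = subprocess.Popen(["make"], stdout=subprocess.PIPE,
                                    stderr=subprocess.STDOUT, text=True)
            for line in proc.stdout:
                log.write(line)
                if line.startswith(">>>"):
                    print(line, end="")
            return proc.wait()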

So, we're going to reject this script.

Regards,
Yann E. MORIN.

> The script is completly autonomous and doesn't impose anything on the current
> build process so it should work without
> tampering the maintenability of the project's Makefile.
> 
> The script has only two dependencies :
>     - packaging (for version parsing while checking Buildroot's dependencies)
>     - rich (for the colored output and layout)
> 
> Features:
>     - test the presence of Buildroot's dependencies, with version control and
> path validation (for `file`)
>     - (optional) test the presence of Buildroot's optional dependencies but
> without blocking the process
>     - can only do the checking or skip it entirely using command line option
>     - filter the output of the build process while trying to detect
> fatal/error/warning/debug loglevel (may have some false positive)
>     - can change the filtering level using a command line option (default on
> ERROR)
>     - keep track of the amount of work to do using `make external-deps` and
> display the overall progress
>     - in case the amount of work isn't correctly detected (I try to account for
> some specials steps), the progress is fixed at the end with a message
>     - keep track of all the task already done and the one currently working
>     - display the current task's step message (extracted from `>>> %s` output)
>     - keep track of the current task progress based on the amount of steps
> done
>     - display a progress in case of `wget` with extraction of filename, size,
> and downloaded from the output
> 
> The script is launching `make` by itself but the user is supposed to have
> already done the configuration needed!
> 
> May be added later:
>     - export raw output from `make` into a log file in /tmp or elsewhere
> (command line option?)
>     - add other trackable step like `wget`?
>     - improve loglevel detection
>     - any request from the team?!
> 
> Some images can be seen on my personal server:
>     - the `--help` menu: https://naheulcraft.be/nextcloud/s/aX8fW5mK8dbXwqT
>     - the progress tracking:
> https://naheulcraft.be/nextcloud/s/87GK7yXQzZHydcj
>     - the finished process: https://naheulcraft.be/nextcloud/s/ZPMt46WP5jib7mk
> 
> Signed-off-by: Raphaël Slagmolen (Tutul) <raphael.slagmolen@mailfence.com>
> ---
>  DEVELOPERS                 |   3 +
>  utils/make_rich_wrapper.py | 660 +++++++++++++++++++++++++++++++++++++
>  2 files changed, 663 insertions(+)
>  create mode 100755 utils/make_rich_wrapper.py
> 
> diff --git a/DEVELOPERS b/DEVELOPERS
> index 86e3f0e7b1..0c78d2f3a1 100644
> --- a/DEVELOPERS
> +++ b/DEVELOPERS
> @@ -3159,3 +3159,6 @@ F:	package/quazip/
>  F:	package/shapelib/
>  F:	package/simple-mail/
>  F:	package/tinc/
> +
> +N:	Raphaël Slagmolen (Tutul) <raphael.slagmolen@mailfence.com>
> +F:	utils/make_rich_wrapper.py
> diff --git a/utils/make_rich_wrapper.py b/utils/make_rich_wrapper.py
> new file mode 100755
> index 0000000000..152b50be65
> --- /dev/null
> +++ b/utils/make_rich_wrapper.py
> @@ -0,0 +1,660 @@
> +#!/usr/bin/env python
> +# -*- coding: utf-8 -*-
> +
> +'''
> +    Date created: 2023/01/07
> +    Date last modified: 2023/01/09
> +    Python Version: 3.10+
> +    Requirement:
> +        packaging==21+
> +        rich==13+
> +'''
> +
> +__author__ = 'Tutul (https://gitlab.com/Tutul)'
> +__copyright__ = 'Copyright 2023, make_rich_wrapper.py for Buildroot'
> +__license__ = 'MIT'
> +__credits__ = ['Tutul']
> +__version__ = '1.0.0'
> +__status__ = 'release'  # release > test > dev
> +__maintainer__ = 'Tutul'
> +__description__ = 'simple python script that act as a wrapper around `make`
> for Buildroot using the Rich module'
> +
> +import argparse
> +import enum
> +import logging
> +import os
> +import re
> +import select
> +import shutil
> +import signal
> +import subprocess
> +from packaging import version
> +from rich import box
> +from rich.console import Console, Group
> +from rich.live import Live
> +from rich.logging import RichHandler
> +from rich.panel import Panel
> +from rich.prompt import Confirm
> +from rich.progress import (
> +    BarColumn,
> +    DownloadColumn,
> +    MofNCompleteColumn,
> +    Progress, SpinnerColumn,
> +    Task,
> +    TaskProgressColumn,
> +    TextColumn,
> +    TimeElapsedColumn,
> +    TimeRemainingColumn,
> +    TransferSpeedColumn
> +)
> +from rich.table import Table
> +
> +############################################################################################
> +
> +
> +class DownloadStep(enum.Enum):
> +    """
> +    Help to know wich downloading information is present
> +    """
> +    LENGTH = 0
> +    OUTPUT = 1
> +    PROGRESS = 2
> +    DONE = 3
> +
> +############################################################################################
> +
> +
> +class State(enum.Enum):
> +    """
> +    Buildroot Makefile's steps
> +    """
> +    READY = 0
> +    DOWNLOADING = 1
> +    EXTRACTING = 2
> +    PATCHING = 3
> +    CONFIGURING = 4
> +    BUILDING = 5
> +    INSTALLING = 6
> +    DONE = 7
> +
> +    @staticmethod
> +    def get_from_message(message) -> str:
> +        if message.__contains__("Downloading"):
> +            return State.DOWNLOADING
> +        elif message.__contains__("Extracting"):
> +            return State.EXTRACTING
> +        elif message.__contains__("Patching"):
> +            return State.PATCHING
> +        elif message.__contains__("Configuring"):
> +            return State.CONFIGURING
> +        elif message.__contains__("Building"):
> +            return State.BUILDING
> +        elif message.__contains__("Installing"):
> +            return State.INSTALLING
> +        else:
> +            return None
> +
> +############################################################################################
> +
> +
> +class Task():
> +    """
> +    Keep track of the current step
> +    """
> +
> +    def __init__(self, current_pkg_progress, step_progress,
> pkg_steps_progress, downloader):
> +        self.current_pkg_progress = current_pkg_progress
> +        self.step_progress = step_progress
> +        self.pkg_steps_progress = pkg_steps_progress
> +        self.downloader = downloader
> +
> +        self.pkg_task_id = None
> +        self.step_task_id = None
> +        self.pkg_step_task_id = None
> +        self.downloading_task_id = None
> +        self.state = None
> +        self.name = None
> +
> +    def __del__(self):
> +        if self.pkg_task_id is not None:
> +            self.current_pkg_progress.stop_task(self.pkg_task_id)
> +            self.pkg_task_id = None
> +        if self.step_task_id is not None:
> +            self.step_progress.stop_task(self.step_task_id)
> +            self.step_progress.update(self.step_task_id, visible=False)
> +            self.step_task_id = None
> +        if self.pkg_step_task_id is not None:
> +            self.pkg_steps_progress.stop_task(self.pkg_step_task_id)
> +            self.pkg_steps_progress.update(self.pkg_step_task_id,
> visible=False)
> +            self.pkg_step_task_id = None
> +        if self.downloading_task_id is not None:
> +            self.downloader.stop_task(self.downloading_task_id)
> +            self.downloader.update(self.downloading_task_id, visible=False)
> +            self.downloading_task_id = None
> +        self.name = None
> +        self.sate = State.DONE
> +
> +    def update(self, new_state, new_step=None) -> None:
> +        self.state = new_state
> +
> +        if self.downloading_task_id is not None and State.DOWNLOADING !=
> self.state:
> +            self.downloader.stop_task(self.downloading_task_id)
> +            self.downloader.update(self.downloading_task_id, visible=False)
> +            self.downloading_task_id = None
> +
> +        if self.step_task_id is not None:
> +            self.step_progress.stop_task(self.step_task_id)
> +            self.step_progress.update(self.step_task_id, visible=False)
> +        if new_step is not None:
> +            self.step_task_id = self.step_progress.add_task("", step=new_step,
> name=self.name)
> +        else:
> +            self.step_task_id = None
> +        if self.pkg_step_task_id is not None:
> +            self.pkg_steps_progress.update(self.pkg_step_task_id,
> completed=self.state.value)
> +
> +    def next(self, new_name, new_state, new_step=None) -> None:
> +        if self.pkg_task_id is not None:
> +            self.current_pkg_progress.stop_task(self.pkg_task_id)
> +            self.current_pkg_progress.update(self.pkg_task_id,
> description="[bold green]%s done!" % self.name)
> +
> +        if self.step_task_id is not None:
> +            self.step_progress.stop_task(self.step_task_id)
> +            self.step_progress.update(self.step_task_id, visible=False)
> +
> +        if self.pkg_step_task_id is not None:
> +            self.pkg_steps_progress.stop_task(self.pkg_step_task_id)
> +            self.pkg_steps_progress.update(self.pkg_step_task_id,
> visible=False)
> +            self.pkg_step_task_id = None
> +
> +        if self.downloading_task_id is not None:
> +            self.downloader.stop_task(self.downloading_task_id)
> +            self.downloader.update(self.downloading_task_id, visible=False)
> +            self.downloading_task_id = None
> +
> +        self.name = new_name
> +        self.state = new_state
> +
> +        if self.state is not None:
> +            self.step_task_id = self.step_progress.add_task("", step=new_step,
> name=self.name)
> +        if "Finishing" == self.name:
> +            self.pkg_task_id = self.current_pkg_progress.add_task("%s" %
> self.name)
> +        else:
> +            self.pkg_step_task_id = self.pkg_steps_progress.add_task("",
> total=State.DONE.value, name=self.name)
> +            self.pkg_task_id = self.current_pkg_progress.add_task("Working on
> %s" % self.name)
> +
> +############################################################################################
> +
> +
> +def handle_task_update(current, message) -> bool:
> +    """
> +    Handle step update and detect task switching
> +    """
> +    finished = False
> +    message_array = message.split(">>> ")[1].split(" ")
> +
> +    name = "[b]" + message_array[0] + "[/b]"
> +    if len(message_array) >= 3:
> +        name = name + " " + message_array[1]
> +        step = message_array[2:]
> +    else:
> +        step = message_array[1:]
> +    if message.__contains__(">>>   "):
> +        name = "Finishing"  # Reset for the last few tasks
> +
> +    step = ' '.join(str(x) for x in step)
> +    state = State.get_from_message(step)
> +
> +    if current.name is None:
> +        current.next(name, state, step)
> +    elif name != current.name:
> +        finished = True
> +        current.update(State.DONE)
> +        if state is None:
> +            state = State.READY
> +        current.next(name, state, step)
> +    elif state is None:
> +        state = current.state
> +
> +    current.update(state, step)
> +    return finished
> +
> +############################################################################################
> +
> +
> +def handle_downloading(current, message, step) -> None:
> +    """
> +    Handle all that is about downloading (wget only for now)
> +    """
> +
> +    log.debug(message)
> +
> +    if current.downloading_task_id is None:
> +        current.downloading_task_id = current.downloader.add_task("",
> total=None, filename="?")
> +
> +    if DownloadStep.LENGTH == step:
> +        current.downloader.update(current.downloading_task_id,
> total=int(message.split("Length: ")[1].split(" (")[0]))
> +    elif DownloadStep.OUTPUT == step:
> +        current.downloader.update(current.downloading_task_id,
> filename=message.split("output' '")[1])
> +    elif DownloadStep.PROGRESS == step:
> +        if message.__contains__(" ."):
> +            downloaded_size = message.split(" .")[0]
> +        else:
> +            downloaded_size = message.split("                                 
>                      ")[0]
> +        downloaded_size.strip().upper().replace("O", "").replace("B",
> "").replace("I", "")
> +        if downloaded_size.__contains__("K"):
> +            downloaded_size = int(downloaded_size.replace("K", "")) * 1000
> +        elif downloaded_size.__contains__("M"):
> +            downloaded_size = int(downloaded_size.replace("M", "")) * 1000000
> +        elif downloaded_size.__contains__("G"):
> +            downloaded_size = int(downloaded_size.replace("G", "")) *
> 1000000000
> +        elif downloaded_size.__contains__("T"):
> +            downloaded_size = int(downloaded_size.replace("T", "")) *
> 1000000000000
> +        current.downloader.update(current.downloading_task_id,
> completed=int(downloaded_size))
> +    elif DownloadStep.DONE == step:
> +        current.downloader.stop_task(current.downloading_task_id)
> +        current.downloader.update(current.downloading_task_id, visible=False)
> +        current.downloading_task_id = None
> +
> +############################################################################################
> +
> +
> +def which(exec_name, package=None, desired_version=None,
> version_tuple_catcher=None, path=None) -> bool:
> +    """
> +    Used to find if a binary is present (and usable) on the system
> +    Can also check for minimal version (as long as the binary use --version)
> +    And may also check if the path is the required one
> +    """
> +    global console
> +    if package is None:
> +        package = exec_name
> +
> +    p = shutil.which(exec_name)
> +    if p is None:
> +        console.print(f"    [red1]:cross_mark: [b]{package}[/b] is required
> but cannot find it in the PATH")
> +        return False
> +    elif path is not None and path != p:
> +        console.print(
> +            f"    [orange_red1]:no_entry: [b]{package}[/b] is found but we
> need it to be in a specific location: {p} --> {path}")
> +        return False
> +    elif desired_version is not None:
> +        if version_tuple_catcher is not None:
> +            local_version = subprocess.run([p, "--version"],
> stdout=subprocess.PIPE, check=True,
> +                                          
> text=True).stdout.split(version_tuple_catcher[0])[1].split(version_tuple_catcher[1])[0]
> +            if version.parse(local_version) < version.parse(desired_version):
> +                console.print(
> +                    f"    [orange1]:exclamation_mark: [b]{package}[/b] is
> found but we require at least the version {desired_version} and you have the
> version {local_version}")
> +                return False
> +        else:
> +            console.print(
> +                f"    [orange1]:exclamation_mark: [b]{package}[/b] is found
> but we require at least the version {desired_version}")
> +            return False
> +    console.print(f"    [green4]:heavy_check_mark:  [b]{package}[/b] is found
> and usable")
> +    return True
> +
> +############################################################################################
> +
> +
> +def test_depencencies(tools_list) -> bool:
> +    """
> +    Simply handle calling which() from a list of tools
> +    """
> +    is_all_ok = True
> +    for tool in tools_list:
> +        if not which(tool[0], tool[1], tool[2], tool[3], tool[4]):
> +            is_all_ok = False
> +    return is_all_ok
> +
> +############################################################################################
> +
> +
> +# which(binary), package[binary], minimal version, version_tuple_catcher,
> required path
> +build_tools_list = [
> +    ("which", None, None, None, None),
> +    ("sed", None, None, None, None),
> +    ("make", None, "3.81", ("GNU Make ", "\n"), None),
> +    ("ld", "binutils", None, None, None),
> +    ("diff", "diffutils", None, None, None),
> +    ("gcc", None, "4.8", ("gcc (GCC) ", "\n"), None),
> +    ("g++", None, "4.8", ("g++ (GCC) ", "\n"), None),
> +    ("bash", None, None, None, None),
> +    ("patch", None, None, None, None),
> +    ("gzip", None, None, None, None),
> +    ("bzip2", None, None, None, None),
> +    ("perl", None, "5.8.7", (" (v", ") built for"), None),
> +    ("tar", None, None, None, None),
> +    ("cpio", None, None, None, None),
> +    ("unzip", None, None, None, None),
> +    ("rsync", None, None, None, None),
> +    ("file", None, None, None, "/usr/bin/file"),
> +    ("bc", None, None, None, None),
> +    ("find", "findutils", None, None, None)
> +]
> +source_fetching_tools_list = [
> +    ("wget", None, None, None, None)
> +]
> +
> +
> +def check_requirements() -> bool:
> +    """
> +    Manage the display and checking for the required tools, stop if any is
> missing
> +    """
> +    global console
> +    requirements_ok = True
> +    console.print("Mandatory packages:")
> +    with console.status("[blue]Looking for mandatory packages...") as status:
> +        console.print("  [i]Build tools:")
> +        requirements_ok = test_depencencies(build_tools_list) and
> requirements_ok
> +        console.print("  [i]Source fetching tools:")
> +        requirements_ok = test_depencencies(source_fetching_tools_list) and
> requirements_ok
> +    if not requirements_ok:
> +        console.print("\nMandatory program are [u]missing[/u]")
> +        console.print("you must install them before retrying!")
> +    return requirements_ok
> +
> +############################################################################################
> +
> +
> +recommanded_dependencies_list = [
> +    ("python", None, "2.7", ("Python ", "\n"), None)
> +]
> +configuration_interface_dependencies_list = [
> +    ("ncursesw5-config", "ncurses5", None, None, None),  # menuconfig
> +    ("qtdiag-qt5", "qt5", None, None, None),  # xconfig
> +    ("glib-compile-schemas", "glib2", None, None, None),  # gconfig
> +    ("gtk-launch", "gtk2", None, None, None),  # gconfig
> +    ("glade", "glade2", None, None, None)  # gconfig
> +]
> +opt_source_fetching_tools_list = [
> +    ("bzr", "bazaar", None, None, None),
> +    ("cvs", None, None, None, None),
> +    ("git", None, None, None, None),
> +    ("hg", "mercurial", None, None, None),
> +    ("scp", None, None, None, None),
> +    ("sftp", None, None, None, None),
> +    ("svn", "subversion", None, None, None)
> +]
> +java_related_packages_list = [
> +    ("javac", None, None, None, None),
> +    ("jar", None, None, None, None)
> +]
> +documentation_generation_tools_list = [
> +    ("asciidoc", None, "8.6.3", ("asciidoc ", "\n"), None),
> +    ("w3m", None, None, None, None),
> +    ("dblatex", None, None, None, None)  # Only for PDF manual
> +]
> +graph_generation_tools_list = [
> +    ("gvgen", "graphviz", None, None, None)
> +]
> +
> +
> +def check_optional_packages() -> bool:
> +    """
> +    Manage the display and checking for the optional tools, continue with a
> warning if any is missing
> +    """
> +    global console
> +    opt_ok = True
> +    console.print("Optional packages:")
> +    with console.status("[blue]Looking for optional packages...") as status:
> +        console.print("  [i]Recommended dependencies:")
> +        opt_ok = test_depencencies(recommanded_dependencies_list) and opt_ok
> +        console.print("  [i]Configuration interface dependencies:")
> +        opt_ok = test_depencencies(configuration_interface_dependencies_list)
> and opt_ok
> +        console.print("  [i]Source fetching tools:")
> +        opt_ok = test_depencencies(opt_source_fetching_tools_list) and opt_ok
> +        console.print("  [i]Java-related packages, if the Java Classpath needs
> to be built for the target system:")
> +        opt_ok = test_depencencies(java_related_packages_list) and opt_ok
> +        console.print("  [i]Documentation generation tools:")
> +        opt_ok = test_depencencies(documentation_generation_tools_list) and
> opt_ok
> +        console.print("  [i]Graph generation tools:")
> +        opt_ok = test_depencencies(graph_generation_tools_list) and opt_ok
> +        console.print(
> +            "    [gray39]:interrobang:  [b]python-matplotlib[/b] cannot be
> detected automaticly. Check your distribution's repositories or [i]pip[/i]")
> +    if not opt_ok:
> +        console.print("\n>>> Some optional program are [u]missing[/u], some
> functionalities may not compile or be incomplete")
> +    return opt_ok
> +
> +############################################################################################
> +
> +
> +# skeleton (4): host-skeleton && skeleton-init-common &&
> skeleton-init-(sysv|systemd|openrc|none) && skeleton
> +# toolcahin (2): toolchain && toolchain-buildroot
> +# gcc(+1): gcc become host-gcc-initial and host-gcc-final
> +# various (~4): host-makedevs && ifupdown-scripts && initscripts &&
> urandom-scripts
> +# +1 for the last task (finalizing host/target directory, sanitizing RPATH,
> generating root filesystems and image rootfs)
> +# BUT.... in some case this number isn't quit exact (especially the 'various'
> section)
> +extra_packages_to_count = 12
> +
> +
> +def load_all_tasks() -> None:
> +    """
> +    Load the external deps on witch we need to work for this run
> +    if an error occure, it's probably because the user didn't do any `make
> menuconfig`
> +    """
> +    global total_pkgs
> +    global console
> +    with console.status("[bold]Loading config, requested packages and
> dependencies[/bold] (did you run `[blue]make menuconfig[/blue]`?)",
> spinner="aesthetic") as status:
> +        external_deps = subprocess.run(["make", "external-deps"],
> stdout=subprocess.PIPE, check=True, text=True)
> +    raw_pkgs_list = external_deps.stdout.splitlines()
> +
> +    # Do some cleaning because all the patches files aren't packages
> +    regex = re.compile(r'^.*\.patch.*$')
> +    total_pkgs = [p for p in raw_pkgs_list if not regex.match(p)]
> +
> +    console.print("In total", (len(total_pkgs) + extra_packages_to_count),
> "packages will be processed for this buildroot")
> +
> +############################################################################################
> +
> +
> +def main() -> None:
> +    # Track the averall progress
> +    pkgs_progress = Progress(
> +        TimeElapsedColumn(),
> +        TextColumn("[blue]{task.description}"),
> +        BarColumn(complete_style='slate_blue1'),
> +        MofNCompleteColumn(),
> +        TaskProgressColumn()
> +    )
> +    # Track the current progress and stay after completion
> +    current_pkg_progress = Progress(
> +        TimeElapsedColumn(),
> +        TextColumn("{task.description}")
> +    )
> +    # Progress bars for single pkg steps (will be hidden when step is done)
> +    step_progress = Progress(
> +        TextColumn("     "),
> +        SpinnerColumn(),
> +        TextColumn("[bold purple]{task.fields[step]}")
> +    )
> +    # Progress bar for current pkg (progress in steps)
> +    pkg_steps_progress = Progress(
> +        TextColumn("[bold cyan]Progress for pkg {task.fields[name]}"),
> +        SpinnerColumn("simpleDotsScrolling"),
> +        TextColumn("({task.completed} of {task.total} steps done)"),
> +    )
> +    # Track any downloading
> +    downloader_progress = Progress(
> +        TextColumn("[bold]{task.fields[filename]}", justify="right"),
> +        BarColumn(bar_width=None, complete_style="dark_magenta",
> pulse_style="medium_purple"),
> +        "[progress.percentage]{task.percentage:>3.1f}%",
> +        "•",
> +        DownloadColumn(),
> +        "•",
> +        TransferSpeedColumn(),
> +        "•",
> +        TimeRemainingColumn()
> +    )
> +
> +    # group of progress bars;
> +    # some are always visible, others will disappear when progress is
> complete
> +    working_panel = Panel(Group(current_pkg_progress, step_progress,
> pkg_steps_progress, downloader_progress), box=box.SIMPLE_HEAD)
> +    progress_group = Group(
> +        working_panel,
> +        pkgs_progress
> +    )
> +
> +    pkgs_task_id = pkgs_progress.add_task("Packages", total=(len(total_pkgs) +
> extra_packages_to_count))
> +    cached = None
> +    overflow = None
> +
> +    # use own live instance as context manager with group of progress bars,
> +    # which allows for running multiple different progress bars in parallel,
> +    # and dynamically showing/hiding them
> +    with Live(progress_group):
> +        # Current task
> +        pkg = Task(current_pkg_progress=current_pkg_progress,
> step_progress=step_progress,
> +                   pkg_steps_progress=pkg_steps_progress,
> downloader=downloader_progress)
> +
> +        # Prepare the call to `make`
> +        modified_env = os.environ.copy()
> +        modified_env["LANG"] = "C"
> +        log.debug(f"Using modified encironment: {modified_env}")
> +        process = subprocess.Popen(
> +            'make',
> +            env=modified_env,
> +            stdout=subprocess.PIPE,
> +            stderr=subprocess.STDOUT,
> +            text=True,
> +            close_fds=True)
> +
> +        # Loop as long as the process is running
> +        while process.poll() is None:
> +            # Wait for something to read from stdout or stderr
> +            _, _, _ = select.select([process.stdout.fileno()], [], [])
> +            read = process.stdout.readline()
> +
> +            # Detect each step
> +            if read.__contains__("[7m>>>"):
> +                if handle_task_update(pkg, read):
> +                    pkgs_progress.advance(pkgs_task_id, advance=1)
> +            # Handles all WGET downloading steps
> +            elif re.search("Length: [0-9]+ \([0-9ioObBkKmMgGtT.,]+\)", read):
> +                handle_downloading(pkg, read, DownloadStep.LENGTH)
> +            elif re.search("wget -.*output' '.*'", read):
> +                handle_downloading(pkg, read, DownloadStep.OUTPUT)
> +            elif re.search("[0-9kKmMgGtTioObB.,]+ [. ]+ [0-9]+%", read):
> +                handle_downloading(pkg, read, DownloadStep.PROGRESS)
> +            elif re.search("[0-9-/pPaAmM ]+ \([0-9., kKmMgGtTbBoOiI\/]+s\) -
> .* saved", read):
> +                handle_downloading(pkg, read, DownloadStep.DONE)
> +            # Handles auto-detect for logging level
> +            elif re.search("[\] ]{1}(EXCEPTION|EMERGENCY|EMERG)[\]: ]{1}",
> read.upper()):
> +                log.exception(read)
> +            elif re.search("[\] ]{1}(CRITICAL|CRIT|FATAL|ALERT)[\]: ]{1}",
> read.upper()):
> +                log.critical(read)
> +            elif re.search("[\] ]{1}(ERROR|ERR)[\]: ]{1}", read.upper()):
> +                log.error(read)
> +            elif re.search("[\] ]{1}(WARNING|WARN)[\]: ]{1}", read.upper()):
> +                log.warning(read)
> +            elif re.search("[\] ]{1}(DEBUG|TRACE)[\]: ]{1}", read.upper()):
> +                log.debug(read)
> +            elif read:
> +                log.info(read)
> +
> +        # Clean the "current" progress bar
> +        del pkg
> +
> +        # We may need to fix this progress depending of the untrackable tasks
> that may be badly accounted for earlier
> +        pkgs_task = [t for t in pkgs_progress.tasks if t.id ==
> pkgs_task_id][0]
> +        if pkgs_task.completed < pkgs_task.total:
> +            cached = pkgs_task.total - pkgs_task.completed
> +            pkgs_progress.update(pkgs_task_id, completed=(pkgs_task.total),
> advance=1)
> +        elif pkgs_task.completed > pkgs_task.total:
> +            overflow = pkgs_task.completed - pkgs_task.total
> +            pkgs_progress.update(pkgs_task_id, total=pkgs_task.completed,
> completed=(pkgs_task.completed), advance=1)
> +        pkgs_progress.refresh()
> +    if cached is not None:
> +        console.print(f"[spring_green3][b]{cached}[/b] pkg(s) loaded from
> [i]cache")
> +    if overflow is not None:
> +        console.print(f"[deep_pink2 b]{overflow} unacounted pkg(s) added to
> our workload")
> +    console.print("Work is done :relieved_face:")
> +
> +############################################################################################
> +
> +
> +def handle_sigint(signum, frame) -> None:
> +    console.print("Interrupted by the user")
> +    quit()
> +
> +
> +signal.signal(signal.SIGINT, handle_sigint)
> +
> +############################################################################################
> +
> +if __name__ == '__main__':
> +    parser = argparse.ArgumentParser(
> +        prog="make_rich_wrapper.py",
> +        description=f"{__description__}",
> +        epilog=f"{__copyright__}, {__author__}, license: {__license__}"
> +    )
> +    parser.add_argument('-l', '--loglevel',
> +                        help='set the verbosity level for the logs printed on
> the screen',
> +                        type=str,
> +                        choices=["all", "debug", "info", "warn", "error",
> "critical"],
> +                        default="error")
> +    group = parser.add_mutually_exclusive_group()
> +    group.add_argument('-f', '--fast',
> +                       help='disable the requirements and optionnal check
> before running',
> +                       action='store_true')
> +    group.add_argument('--only-check',
> +                       help='only do the requirements and optionnal check and
> exit',
> +                       action='store_true')
> +    parser.add_argument('--check-optional',
> +                        help='enable checking for optional ressources and
> exit',
> +                        action='store_true')
> +    parser.add_argument('--version',
> +                        help='output version information and exit',
> +                        action='version',
> +                        version=f'%(prog)s {__version__} - {__status__}')
> +    parser.add_argument('--about',
> +                        help='output about information and exit',
> +                        action='store_true')
> +
> +    arguments = parser.parse_args()
> +
> +    if arguments.about:
> +        console = Console(highlight=False)
> +        console.print(f"{__copyright__}\nCreated by {__author__}, license:
> {__license__}\n\n{__description__}\n")
> +        if "release" == __status__:
> +            console.print(f"[green b]{__status__}")
> +        elif "test" == __status__:
> +            console.print(f"[cyan]{__status__}")
> +        elif "dev" == __status__:
> +            console.print(f"[purple i]{__status__}")
> +        else:
> +            console.print(f"[red u]{__status__}")
> +        console.print(f"Version: {__version__}\nMaintainer:
> {__maintainer__}\n\nCredits: {__credits__}")
> +        quit()
> +
> +    # Initialise the rich console for printing
> +    console = Console()
> +
> +    # Initialing the rich log handler
> +    logging.basicConfig(
> +        level=arguments.loglevel.upper().replace("ALL",
> "NOTSET").replace("WARN", "WARNING"),
> +        format="%(message)s",
> +        datefmt="[%X]",
> +        handlers=[RichHandler(rich_tracebacks=True, show_path=False)]
> +    )
> +    log = logging.getLogger("rich")
> +
> +    if not arguments.fast:
> +        ok = check_requirements()
> +        if not arguments.only_check:
> +            if ok:
> +                console.rule(style="green")
> +            else:
> +                quit()
> +
> +    if not arguments.fast and arguments.check_optional:
> +        ok = check_optional_packages()
> +        if not arguments.only_check:
> +            if ok:
> +                console.rule(style="blue")
> +            else:
> +                console.rule(style="magenta")
> +
> +    if not arguments.only_check:
> +        load_all_tasks()
> +        if not arguments.fast and not Confirm.ask("Proceed? (it may take some
> times)", console=console, default="y"):
> +            quit()
> +        console.print("\n\n")
> +        main()
> -- 
> 2.39.0
> 
> ----
> From: raphael.slagmolen@mailfence.com
> To: buildroot@buildroot.org, raphael.slagmolen@mailfence.com
> Jan. 9, 2023, 15:57:48
> 
> ----
> From: raphael.slagmolen@mailfence.com
> To: buildroot@buildroot.org
> Jan. 9, 2023, 16:22:52



> _______________________________________________
> buildroot mailing list
> buildroot@buildroot.org
> https://lists.buildroot.org/mailman/listinfo/buildroot

Patch

diff --git a/DEVELOPERS b/DEVELOPERS
index 86e3f0e7b1..0c78d2f3a1 100644
--- a/DEVELOPERS
+++ b/DEVELOPERS
@@ -3159,3 +3159,6 @@  F:	package/quazip/
 F:	package/shapelib/
 F:	package/simple-mail/
 F:	package/tinc/
+
+N:	Raphaël Slagmolen (Tutul) <raphael.slagmolen@mailfence.com>
+F:	utils/make_rich_wrapper.py
diff --git a/utils/make_rich_wrapper.py b/utils/make_rich_wrapper.py
new file mode 100755
index 0000000000..152b50be65
--- /dev/null
+++ b/utils/make_rich_wrapper.py
@@ -0,0 +1,660 @@ 
+#!/usr/bin/env python
+# -*- coding: utf-8 -*-
+
+'''
+    Date created: 2023/01/07
+    Date last modified: 2023/01/09
+    Python Version: 3.10+
+    Requirement:
+        packaging==21+
+        rich==13+
+'''
+
+__author__ = 'Tutul (https://gitlab.com/Tutul)'
+__copyright__ = 'Copyright 2023, make_rich_wrapper.py for Buildroot'
+__license__ = 'MIT'
+__credits__ = ['Tutul']
+__version__ = '1.0.0'
+__status__ = 'release'  # release > test > dev
+__maintainer__ = 'Tutul'
+__description__ = 'simple python script that acts as a wrapper around `make` for Buildroot using the Rich module'
+
+import argparse
+import enum
+import logging
+import os
+import re
+import select
+import shutil
+import signal
+import subprocess
+from packaging import version
+from rich import box
+from rich.console import Console, Group
+from rich.live import Live
+from rich.logging import RichHandler
+from rich.panel import Panel
+from rich.prompt import Confirm
+from rich.progress import (
+    BarColumn,
+    DownloadColumn,
+    MofNCompleteColumn,
+    Progress, SpinnerColumn,
+    Task,
+    TaskProgressColumn,
+    TextColumn,
+    TimeElapsedColumn,
+    TimeRemainingColumn,
+    TransferSpeedColumn
+)
+from rich.table import Table
+
+############################################################################################
+
+
+class DownloadStep(enum.Enum):
+    """
+    Help to know which downloading information is present
+    """
+    LENGTH = 0
+    OUTPUT = 1
+    PROGRESS = 2
+    DONE = 3
+
+############################################################################################
+
+
+class State(enum.Enum):
+    """
+    Buildroot Makefile's steps
+    """
+    READY = 0
+    DOWNLOADING = 1
+    EXTRACTING = 2
+    PATCHING = 3
+    CONFIGURING = 4
+    BUILDING = 5
+    INSTALLING = 6
+    DONE = 7
+
+    @staticmethod
+    def get_from_message(message) -> str:
+        if message.__contains__("Downloading"):
+            return State.DOWNLOADING
+        elif message.__contains__("Extracting"):
+            return State.EXTRACTING
+        elif message.__contains__("Patching"):
+            return State.PATCHING
+        elif message.__contains__("Configuring"):
+            return State.CONFIGURING
+        elif message.__contains__("Building"):
+            return State.BUILDING
+        elif message.__contains__("Installing"):
+            return State.INSTALLING
+        else:
+            return None
+
+############################################################################################
+
+
+class Task():
+    """
+    Keep track of the current step
+    """
+
+    def __init__(self, current_pkg_progress, step_progress, pkg_steps_progress, downloader):
+        self.current_pkg_progress = current_pkg_progress
+        self.step_progress = step_progress
+        self.pkg_steps_progress = pkg_steps_progress
+        self.downloader = downloader
+
+        self.pkg_task_id = None
+        self.step_task_id = None
+        self.pkg_step_task_id = None
+        self.downloading_task_id = None
+        self.state = None
+        self.name = None
+
+    def __del__(self):
+        if self.pkg_task_id is not None:
+            self.current_pkg_progress.stop_task(self.pkg_task_id)
+            self.pkg_task_id = None
+        if self.step_task_id is not None:
+            self.step_progress.stop_task(self.step_task_id)
+            self.step_progress.update(self.step_task_id, visible=False)
+            self.step_task_id = None
+        if self.pkg_step_task_id is not None:
+            self.pkg_steps_progress.stop_task(self.pkg_step_task_id)
+            self.pkg_steps_progress.update(self.pkg_step_task_id, visible=False)
+            self.pkg_step_task_id = None
+        if self.downloading_task_id is not None:
+            self.downloader.stop_task(self.downloading_task_id)
+            self.downloader.update(self.downloading_task_id, visible=False)
+            self.downloading_task_id = None
+        self.name = None
+        self.state = State.DONE
+
+    def update(self, new_state, new_step=None) -> None:
+        self.state = new_state
+
+        if self.downloading_task_id is not None and State.DOWNLOADING != self.state:
+            self.downloader.stop_task(self.downloading_task_id)
+            self.downloader.update(self.downloading_task_id, visible=False)
+            self.downloading_task_id = None
+
+        if self.step_task_id is not None:
+            self.step_progress.stop_task(self.step_task_id)
+            self.step_progress.update(self.step_task_id, visible=False)
+        if new_step is not None:
+            self.step_task_id = self.step_progress.add_task("", step=new_step, name=self.name)
+        else:
+            self.step_task_id = None
+        if self.pkg_step_task_id is not None:
+            self.pkg_steps_progress.update(self.pkg_step_task_id, completed=self.state.value)
+
+    def next(self, new_name, new_state, new_step=None) -> None:
+        if self.pkg_task_id is not None:
+            self.current_pkg_progress.stop_task(self.pkg_task_id)
+            self.current_pkg_progress.update(self.pkg_task_id, description="[bold green]%s done!" % self.name)
+
+        if self.step_task_id is not None:
+            self.step_progress.stop_task(self.step_task_id)
+            self.step_progress.update(self.step_task_id, visible=False)
+
+        if self.pkg_step_task_id is not None:
+            self.pkg_steps_progress.stop_task(self.pkg_step_task_id)
+            self.pkg_steps_progress.update(self.pkg_step_task_id, visible=False)
+            self.pkg_step_task_id = None
+
+        if self.downloading_task_id is not None:
+            self.downloader.stop_task(self.downloading_task_id)
+            self.downloader.update(self.downloading_task_id, visible=False)
+            self.downloading_task_id = None
+
+        self.name = new_name
+        self.state = new_state
+
+        if self.state is not None:
+            self.step_task_id = self.step_progress.add_task("", step=new_step, name=self.name)
+        if "Finishing" == self.name:
+            self.pkg_task_id = self.current_pkg_progress.add_task("%s" % self.name)
+        else:
+            self.pkg_step_task_id = self.pkg_steps_progress.add_task("", total=State.DONE.value, name=self.name)
+            self.pkg_task_id = self.current_pkg_progress.add_task("Working on %s" % self.name)
+
+############################################################################################
+
+
+def handle_task_update(current, message) -> bool:
+    """
+    Handle step update and detect task switching
+    """
+    finished = False
+    message_array = message.split(">>> ")[1].split(" ")
+
+    name = "[b]" + message_array[0] + "[/b]"
+    if len(message_array) >= 3:
+        name = name + " " + message_array[1]
+        step = message_array[2:]
+    else:
+        step = message_array[1:]
+    if message.__contains__(">>>   "):
+        name = "Finishing"  # Reset for the last few tasks
+
+    step = ' '.join(str(x) for x in step)
+    state = State.get_from_message(step)
+
+    if current.name is None:
+        current.next(name, state, step)
+    elif name != current.name:
+        finished = True
+        current.update(State.DONE)
+        if state is None:
+            state = State.READY
+        current.next(name, state, step)
+    elif state is None:
+        state = current.state
+
+    current.update(state, step)
+    return finished
+
+############################################################################################
+
+
+def handle_downloading(current, message, step) -> None:
+    """
+    Handle all that is about downloading (wget only for now)
+    """
+
+    log.debug(message)
+
+    if current.downloading_task_id is None:
+        current.downloading_task_id = current.downloader.add_task("", total=None, filename="?")
+
+    if DownloadStep.LENGTH == step:
+        current.downloader.update(current.downloading_task_id, total=int(message.split("Length: ")[1].split(" (")[0]))
+    elif DownloadStep.OUTPUT == step:
+        current.downloader.update(current.downloading_task_id, filename=message.split("output' '")[1])
+    elif DownloadStep.PROGRESS == step:
+        if message.__contains__(" ."):
+            downloaded_size = message.split(" .")[0]
+        else:
+            downloaded_size = message.split("                                 
                     ")[0]
+        downloaded_size = downloaded_size.strip().upper().replace("O", "").replace("B", "").replace("I", "")
+        if downloaded_size.__contains__("K"):
+            downloaded_size = int(downloaded_size.replace("K", "")) * 1000
+        elif downloaded_size.__contains__("M"):
+            downloaded_size = int(downloaded_size.replace("M", "")) * 1000000
+        elif downloaded_size.__contains__("G"):
+            downloaded_size = int(downloaded_size.replace("G", "")) *
1000000000
+        elif downloaded_size.__contains__("T"):
+            downloaded_size = int(downloaded_size.replace("T", "")) *
1000000000000
+        current.downloader.update(current.downloading_task_id,
completed=int(downloaded_size))
+    elif DownloadStep.DONE == step:
+        current.downloader.stop_task(current.downloading_task_id)
+        current.downloader.update(current.downloading_task_id, visible=False)
+        current.downloading_task_id = None
+
+############################################################################################
+
+
+def which(exec_name, package=None, desired_version=None, version_tuple_catcher=None, path=None) -> bool:
+    """
+    Used to find if a binary is present (and usable) on the system
+    Can also check for minimal version (as long as the binary use --version)
+    And may also check if the path is the required one
+    """
+    global console
+    if package is None:
+        package = exec_name
+
+    p = shutil.which(exec_name)
+    if p is None:
+        console.print(f"    [red1]:cross_mark: [b]{package}[/b] is required
but cannot find it in the PATH")
+        return False
+    elif path is not None and path != p:
+        console.print(
+            f"    [orange_red1]:no_entry: [b]{package}[/b] is found but we
need it to be in a specific location: {p} --> {path}")
+        return False
+    elif desired_version is not None:
+        if version_tuple_catcher is not None:
+            local_version = subprocess.run([p, "--version"],
stdout=subprocess.PIPE, check=True,
+                                          
text=True).stdout.split(version_tuple_catcher[0])[1].split(version_tuple_catcher[1])[0]
+            if version.parse(local_version) < version.parse(desired_version):
+                console.print(
+                    f"    [orange1]:exclamation_mark: [b]{package}[/b] is
found but we require at least the version {desired_version} and you have the
version {local_version}")
+                return False
+        else:
+            console.print(
+                f"    [orange1]:exclamation_mark: [b]{package}[/b] is found
but we require at least the version {desired_version}")
+            return False
+    console.print(f"    [green4]:heavy_check_mark:  [b]{package}[/b] is found
and usable")
+    return True
+
+############################################################################################
+
+
+def test_depencencies(tools_list) -> bool:
+    """
+    Simply handle calling which() from a list of tools
+    """
+    is_all_ok = True
+    for tool in tools_list:
+        if not which(tool[0], tool[1], tool[2], tool[3], tool[4]):
+            is_all_ok = False
+    return is_all_ok
+
+############################################################################################
+
+
+# which(binary), package[binary], minimal version, version_tuple_catcher, required path
+build_tools_list = [
+    ("which", None, None, None, None),
+    ("sed", None, None, None, None),
+    ("make", None, "3.81", ("GNU Make ", "\n"), None),
+    ("ld", "binutils", None, None, None),
+    ("diff", "diffutils", None, None, None),
+    ("gcc", None, "4.8", ("gcc (GCC) ", "\n"), None),
+    ("g++", None, "4.8", ("g++ (GCC) ", "\n"), None),
+    ("bash", None, None, None, None),
+    ("patch", None, None, None, None),
+    ("gzip", None, None, None, None),
+    ("bzip2", None, None, None, None),
+    ("perl", None, "5.8.7", (" (v", ") built for"), None),
+    ("tar", None, None, None, None),
+    ("cpio", None, None, None, None),
+    ("unzip", None, None, None, None),
+    ("rsync", None, None, None, None),
+    ("file", None, None, None, "/usr/bin/file"),
+    ("bc", None, None, None, None),
+    ("find", "findutils", None, None, None)
+]
+source_fetching_tools_list = [
+    ("wget", None, None, None, None)
+]
+
+
+def check_requirements() -> bool:
+    """
+    Manage the display and checking for the required tools, stop if any is missing
+    """
+    global console
+    requirements_ok = True
+    console.print("Mandatory packages:")
+    with console.status("[blue]Looking for mandatory packages...") as status:
+        console.print("  [i]Build tools:")
+        requirements_ok = test_depencencies(build_tools_list) and requirements_ok
+        console.print("  [i]Source fetching tools:")
+        requirements_ok = test_depencencies(source_fetching_tools_list) and requirements_ok
+    if not requirements_ok:
+        console.print("\nMandatory programs are [u]missing[/u],")
+        console.print("you must install them before retrying!")
+    return requirements_ok
+
+############################################################################################
+
+
+recommanded_dependencies_list = [
+    ("python", None, "2.7", ("Python ", "\n"), None)
+]
+configuration_interface_dependencies_list = [
+    ("ncursesw5-config", "ncurses5", None, None, None),  # menuconfig
+    ("qtdiag-qt5", "qt5", None, None, None),  # xconfig
+    ("glib-compile-schemas", "glib2", None, None, None),  # gconfig
+    ("gtk-launch", "gtk2", None, None, None),  # gconfig
+    ("glade", "glade2", None, None, None)  # gconfig
+]
+opt_source_fetching_tools_list = [
+    ("bzr", "bazaar", None, None, None),
+    ("cvs", None, None, None, None),
+    ("git", None, None, None, None),
+    ("hg", "mercurial", None, None, None),
+    ("scp", None, None, None, None),
+    ("sftp", None, None, None, None),
+    ("svn", "subversion", None, None, None)
+]
+java_related_packages_list = [
+    ("javac", None, None, None, None),
+    ("jar", None, None, None, None)
+]
+documentation_generation_tools_list = [
+    ("asciidoc", None, "8.6.3", ("asciidoc ", "\n"), None),
+    ("w3m", None, None, None, None),
+    ("dblatex", None, None, None, None)  # Only for PDF manual
+]
+graph_generation_tools_list = [
+    ("gvgen", "graphviz", None, None, None)
+]
+
+
+def check_optional_packages() -> bool:
+    """
+    Manage the display and checking for the optional tools, continue with a warning if any is missing
+    """
+    global console
+    opt_ok = True
+    console.print("Optional packages:")
+    with console.status("[blue]Looking for optional packages...") as status:
+        console.print("  [i]Recommended dependencies:")
+        opt_ok = test_depencencies(recommanded_dependencies_list) and opt_ok
+        console.print("  [i]Configuration interface dependencies:")
+        opt_ok = test_depencencies(configuration_interface_dependencies_list) and opt_ok
+        console.print("  [i]Source fetching tools:")
+        opt_ok = test_depencencies(opt_source_fetching_tools_list) and opt_ok
+        console.print("  [i]Java-related packages, if the Java Classpath needs to be built for the target system:")
+        opt_ok = test_depencencies(java_related_packages_list) and opt_ok
+        console.print("  [i]Documentation generation tools:")
+        opt_ok = test_depencencies(documentation_generation_tools_list) and opt_ok
+        console.print("  [i]Graph generation tools:")
+        opt_ok = test_depencencies(graph_generation_tools_list) and opt_ok
+        console.print(
+            "    [gray39]:interrobang:  [b]python-matplotlib[/b] cannot be detected automatically. Check your distribution's repositories or [i]pip[/i]")
+    if not opt_ok:
+        console.print("\n>>> Some optional programs are [u]missing[/u], some functionality may not compile or may be incomplete")
+    return opt_ok
+
+############################################################################################
+
+
+# skeleton (4): host-skeleton && skeleton-init-common &&
skeleton-init-(sysv|systemd|openrc|none) && skeleton
+# toolcahin (2): toolchain && toolchain-buildroot
+# gcc(+1): gcc become host-gcc-initial and host-gcc-final
+# various (~4): host-makedevs && ifupdown-scripts && initscripts &&
urandom-scripts
+# +1 for the last task (finalizing host/target directory, sanitizing RPATH,
generating root filesystems and image rootfs)
+# BUT.... in some case this number isn't quit exact (especially the 'various'
section)
+extra_packages_to_count = 12
+
+
+def load_all_tasks() -> None:
+    """
+    Load the external deps on which we need to work for this run
+    if an error occurs, it's probably because the user didn't do any `make menuconfig`
+    """
+    global total_pkgs
+    global console
+    with console.status("[bold]Loading config, requested packages and dependencies[/bold] (did you run `[blue]make menuconfig[/blue]`?)", spinner="aesthetic") as status:
+        external_deps = subprocess.run(["make", "external-deps"], stdout=subprocess.PIPE, check=True, text=True)
+    raw_pkgs_list = external_deps.stdout.splitlines()
+
+    # Do some cleaning because the patch files aren't packages
+    regex = re.compile(r'^.*\.patch.*$')
+    total_pkgs = [p for p in raw_pkgs_list if not regex.match(p)]
+
+    console.print("In total", (len(total_pkgs) + extra_packages_to_count), "packages will be processed for this buildroot")
+
+############################################################################################
+
+
+def main() -> None:
+    # Track the overall progress
+    pkgs_progress = Progress(
+        TimeElapsedColumn(),
+        TextColumn("[blue]{task.description}"),
+        BarColumn(complete_style='slate_blue1'),
+        MofNCompleteColumn(),
+        TaskProgressColumn()
+    )
+    # Track the current progress and stay after completion
+    current_pkg_progress = Progress(
+        TimeElapsedColumn(),
+        TextColumn("{task.description}")
+    )
+    # Progress bars for single pkg steps (will be hidden when step is done)
+    step_progress = Progress(
+        TextColumn("     "),
+        SpinnerColumn(),
+        TextColumn("[bold purple]{task.fields[step]}")
+    )
+    # Progress bar for current pkg (progress in steps)
+    pkg_steps_progress = Progress(
+        TextColumn("[bold cyan]Progress for pkg {task.fields[name]}"),
+        SpinnerColumn("simpleDotsScrolling"),
+        TextColumn("({task.completed} of {task.total} steps done)"),
+    )
+    # Track any downloading
+    downloader_progress = Progress(
+        TextColumn("[bold]{task.fields[filename]}", justify="right"),
+        BarColumn(bar_width=None, complete_style="dark_magenta",
pulse_style="medium_purple"),
+        "[progress.percentage]{task.percentage:>3.1f}%",
+        "•",
+        DownloadColumn(),
+        "•",
+        TransferSpeedColumn(),
+        "•",
+        TimeRemainingColumn()
+    )
+
+    # group of progress bars;
+    # some are always visible, others will disappear when progress is complete
+    working_panel = Panel(Group(current_pkg_progress, step_progress, pkg_steps_progress, downloader_progress), box=box.SIMPLE_HEAD)
+    progress_group = Group(
+        working_panel,
+        pkgs_progress
+    )
+
+    pkgs_task_id = pkgs_progress.add_task("Packages", total=(len(total_pkgs) +
extra_packages_to_count))
+    cached = None
+    overflow = None
+
+    # use own live instance as context manager with group of progress bars,
+    # which allows for running multiple different progress bars in parallel,
+    # and dynamically showing/hiding them
+    with Live(progress_group):
+        # Current task
+        pkg = Task(current_pkg_progress=current_pkg_progress, step_progress=step_progress,
+                   pkg_steps_progress=pkg_steps_progress, downloader=downloader_progress)
+
+        # Prepare the call to `make`
+        modified_env = os.environ.copy()
+        modified_env["LANG"] = "C"
+        log.debug(f"Using modified encironment: {modified_env}")
+        process = subprocess.Popen(
+            'make',
+            env=modified_env,
+            stdout=subprocess.PIPE,
+            stderr=subprocess.STDOUT,
+            text=True,
+            close_fds=True)
+
+        # Loop as long as the process is running
+        while process.poll() is None:
+            # Wait for something to read from stdout or stderr
+            _, _, _ = select.select([process.stdout.fileno()], [], [])
+            read = process.stdout.readline()
+
+            # Detect each step
+            if read.__contains__("[7m>>>"):
+                if handle_task_update(pkg, read):
+                    pkgs_progress.advance(pkgs_task_id, advance=1)
+            # Handles all WGET downloading steps
+            elif re.search("Length: [0-9]+ \([0-9ioObBkKmMgGtT.,]+\)", read):
+                handle_downloading(pkg, read, DownloadStep.LENGTH)
+            elif re.search("wget -.*output' '.*'", read):
+                handle_downloading(pkg, read, DownloadStep.OUTPUT)
+            elif re.search("[0-9kKmMgGtTioObB.,]+ [. ]+ [0-9]+%", read):
+                handle_downloading(pkg, read, DownloadStep.PROGRESS)
+            elif re.search("[0-9-/pPaAmM ]+ \([0-9., kKmMgGtTbBoOiI\/]+s\) -
.* saved", read):
+                handle_downloading(pkg, read, DownloadStep.DONE)
+            # Handles auto-detect for logging level
+            elif re.search("[\] ]{1}(EXCEPTION|EMERGENCY|EMERG)[\]: ]{1}",
read.upper()):
+                log.exception(read)
+            elif re.search("[\] ]{1}(CRITICAL|CRIT|FATAL|ALERT)[\]: ]{1}",
read.upper()):
+                log.critical(read)
+            elif re.search("[\] ]{1}(ERROR|ERR)[\]: ]{1}", read.upper()):
+                log.error(read)
+            elif re.search("[\] ]{1}(WARNING|WARN)[\]: ]{1}", read.upper()):
+                log.warning(read)
+            elif re.search("[\] ]{1}(DEBUG|TRACE)[\]: ]{1}", read.upper()):
+                log.debug(read)
+            elif read:
+                log.info(read)
+
+        # Clean the "current" progress bar
+        del pkg
+
+        # We may need to fix this progress depending on the untrackable tasks that may be badly accounted for earlier
+        pkgs_task = [t for t in pkgs_progress.tasks if t.id == pkgs_task_id][0]
+        if pkgs_task.completed < pkgs_task.total:
+            cached = pkgs_task.total - pkgs_task.completed
+            pkgs_progress.update(pkgs_task_id, completed=(pkgs_task.total), advance=1)
+        elif pkgs_task.completed > pkgs_task.total:
+            overflow = pkgs_task.completed - pkgs_task.total
+            pkgs_progress.update(pkgs_task_id, total=pkgs_task.completed, completed=(pkgs_task.completed), advance=1)
+        pkgs_progress.refresh()
+    if cached is not None:
+        console.print(f"[spring_green3][b]{cached}[/b] pkg(s) loaded from
[i]cache")
+    if overflow is not None:
+        console.print(f"[deep_pink2 b]{overflow} unacounted pkg(s) added to
our workload")
+    console.print("Work is done :relieved_face:")
+
+############################################################################################
+
+
+def handle_sigint(signum, frame) -> None:
+    console.print("Interrupted by the user")
+    quit()
+
+
+signal.signal(signal.SIGINT, handle_sigint)
+
+############################################################################################
+
+if __name__ == '__main__':
+    parser = argparse.ArgumentParser(
+        prog="make_rich_wrapper.py",
+        description=f"{__description__}",
+        epilog=f"{__copyright__}, {__author__}, license: {__license__}"
+    )
+    parser.add_argument('-l', '--loglevel',
+                        help='set the verbosity level for the logs printed on the screen',
+                        type=str,
+                        choices=["all", "debug", "info", "warn", "error",
"critical"],
+                        default="error")
+    group = parser.add_mutually_exclusive_group()
+    group.add_argument('-f', '--fast',
+                       help='disable the requirements and optionnal check