iripau.logging module¶
Generic and stand-alone logging utilities that can also be used in conjunction
with subprocess to log the subprocesses’ stdout and stderr in
real-time, regardless of whether the subprocess raised an exception via
check=True, while maintaining the ability to capture the subprocess’
output.
Example
Set up one logger for stdout, another for stderr and a third one to
log outside of the subprocesses. All of the loggers will write to the same
file:
import logging
import tempfile
from iripau.logging import SimpleThreadNameFormatter
output_file = tempfile.SpooledTemporaryFile(mode="w+t")
normal_log_format = "%(asctime).19s - %(threadName)12.12s: %(message)s"
output_log_format = "%(asctime).19s - %(threadName)12.12s: [%(name)s] %(message)s"
# Setup normal log handler
formatter = SimpleThreadNameFormatter(normal_log_format)
handler = logging.StreamHandler(output_file)
handler.setLevel(logging.DEBUG)
handler.setFormatter(formatter)
# Setup normal logger
logger = logging.getLogger(__name__)
logger.setLevel(logging.DEBUG)
logger.addHandler(handler)
# Setup stdout and stderr log handler
formatter = SimpleThreadNameFormatter(output_log_format)
handler = logging.StreamHandler(output_file)
handler.setLevel(logging.DEBUG)
handler.setFormatter(formatter)
# Setup stdout logger
stdout_logger = logging.getLogger("stdout")
stdout_logger.setLevel(logging.DEBUG)
stdout_logger.addHandler(handler)
# Setup stderr logger
stderr_logger = logging.getLogger("stderr")
stderr_logger.setLevel(logging.DEBUG)
stderr_logger.addHandler(handler)
Example
To use the stdout_logger and stderr_logger objects from the example
above in a subprocess using subprocess, the LoggerFile
objects can be created before spawning the processes, and closed at the end:
from iripau.logging import LoggerFile
from iripau.subprocess import run
# Make sure to close the LoggerFiles
with (
LoggerFile(stdout_logger, logging.INFO) as stdout_logger_file,
LoggerFile(stderr_logger, logging.INFO) as stderr_logger_file
):
output = run(
["kubectl", "get", "pods"],
capture_output=True,
check=True,
prompt_tees=[stdout_logger_file],
stdout_tees=[stdout_logger_file],
stderr_tees=[stderr_logger_file]
)
# Output captured
print(output.stdout)
print(output.stderr)
print(output.returncode)
# Output logged
output_file.seek(0)
print(output_file.read())
Important
Closing the LoggerFile objects does not affect the output_file
created in the first example.
Tip
Passing a callable as a tee file in subprocess.Popen causes the
file to be created (the result of calling it) before spawning the process,
and closed when it finishes.
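The create-on-call, close-on-finish lifecycle can be illustrated with a pure-Python sketch. The helper `run_with_tee` below is hypothetical (it is not part of iripau); it only mimics the convention described in the tip:

```python
import tempfile

def run_with_tee(tee):
    """Minimal sketch of the callable-tee convention: if the tee is
    callable, call it to create the file, and close it when done."""
    created = callable(tee)
    tee_file = tee() if created else tee
    try:
        # The spawned process' output would be written here
        tee_file.write("output line\n")
    finally:
        if created:
            tee_file.close()

# A zero-argument callable that creates the file on demand
factory = lambda: tempfile.TemporaryFile(mode="w+t")
run_with_tee(factory)
```

Note that a file passed directly (not as a callable) is left open, since the caller owns it.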
Example
To log the command, stdout and stderr of all of the processes spawned
using subprocess, a global configuration can be set up using the
stdout_logger and stderr_logger objects:
from functools import partial
from iripau.logging import LoggerFile
from iripau.subprocess import set_global_prompt_files
from iripau.subprocess import set_global_stdout_files
from iripau.subprocess import set_global_stderr_files
# Create callables that receive no arguments and return a LoggerFile
stdout_logger_file = partial(LoggerFile, stdout_logger, logging.INFO)
stderr_logger_file = partial(LoggerFile, stderr_logger, logging.INFO)
set_global_prompt_files(stdout_logger_file)
set_global_stdout_files(stdout_logger_file)
set_global_stderr_files(stderr_logger_file)
output = run(
["kubectl", "get", "pods"],
capture_output=True,
check=True
)
# Output captured
print(output.stdout)
print(output.stderr)
print(output.returncode)
# Output logged
output_file.seek(0)
print(output_file.read())
Tip
Disable the use of the globally configured loggers by setting
add_global_stdout_tees, add_global_stderr_tees and/or
add_global_prompt_tees to False in subprocess.run() or
subprocess.Popen:
output = run(
["kubectl", "get", "pods"],
capture_output=True,
check=True,
add_global_stdout_tees=False,
add_global_stderr_tees=False,
add_global_prompt_tees=False
)
- class iripau.logging.LoggerFile(logger: Logger, level: int)[source]¶
Bases: object
File-like object that logs every line written to it. A thread is created to asynchronously consume the file. The thread is joined when the file is closed.
- Parameters:
logger – The object used to perform the logging.
level – The level used to perform the logging.
Example
Get a logger object and use it to create a LoggerFile:
import logging
from iripau.logging import LoggerFile
logger = logging.getLogger(__name__)
logger_file = LoggerFile(logger, logging.DEBUG)
# Equivalent to logger.debug("A line in a file")
logger_file.write("A line in a file\n")
Example
A LoggerFile can be used in a subprocess:
from subprocess import run
# The stdout will be logged in real-time
run(["apt", "install", "-y", "python3"], stdout=logger_file)
- static log(read_file, logger, level)[source]¶
Read the whole read_file line by line and log each line using the logger object at the specified logging level. This is what will be running in a thread until EOF is reached.
- Parameters:
read_file (IOBase) – A file opened in read mode.
logger (Logger) – The object used to perform the logging.
level (int) – The level used to perform the logging.
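The reader loop described above can be sketched in pure Python. This is a simplified stand-in, not the library's actual implementation:

```python
import io
import logging

def log_lines(read_file, logger, level):
    # Read until EOF, emitting one log record per line (newline stripped)
    for line in read_file:
        logger.log(level, line.rstrip("\n"))

# Usage: feed an in-memory file to a logger
logging.basicConfig(level=logging.DEBUG)
log_lines(io.StringIO("first\nsecond\n"), logging.getLogger("demo"), logging.INFO)
```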
- static patch(instance, method_name, callback)[source]¶
Modify instance.method_name() so that callback() gets called when method_name() is called. callback() will be called first and then the original instance.method_name(). It is used to close the file when the thread is joined.
- Parameters:
instance (object) – An object whose method will be patched.
method_name (str) – The name of the method to patch.
callback (Callable[[], None]) – The function to patch with. It will be called without arguments and its return value will be ignored.
Example
Print a message before closing a file:
def print_dummy_message():
    print("The file is about to be closed...")

f = open("/tmp/file.txt")
LoggerFile.patch(f, "close", print_dummy_message)
# The message will be printed right before the file is closed
f.close()
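The patching technique itself can be sketched in pure Python. This is an approximation of the documented behaviour (callback first, then the original method), not the library's source:

```python
def patch(instance, method_name, callback):
    """Shadow a bound method with a wrapper that runs callback() first."""
    original = getattr(instance, method_name)

    def wrapper(*args, **kwargs):
        callback()  # the callback's return value is ignored
        return original(*args, **kwargs)

    # Instance attribute shadows the method looked up on the class
    setattr(instance, method_name, wrapper)
```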
- class iripau.logging.SimpleThreadNameFormatter(fmt=None, datefmt=None, style='%', validate=True, *, defaults=None)[source]¶
Bases: Formatter
The same as logging.Formatter, but threadName is just the first token after splitting it: record.threadName.split(maxsplit=1)[0]. The constructor arguments are the same as the base class.
Example
Set up a logging handler to use SimpleThreadNameFormatter:
import sys
import logging
from iripau.logging import SimpleThreadNameFormatter
format = "%(threadName)s: %(message)s"
formatter = SimpleThreadNameFormatter(format)
handler = logging.StreamHandler(sys.stdout)
handler.setFormatter(formatter)
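The thread-name shortening can be sketched with a plain logging.Formatter subclass. This is an approximation of what the class does, not its actual source:

```python
import logging

class ShortThreadNameFormatter(logging.Formatter):
    """Format records keeping only the first token of threadName."""
    def format(self, record):
        record.threadName = record.threadName.split(maxsplit=1)[0]
        return super().format(record)

formatter = ShortThreadNameFormatter("%(threadName)s: %(message)s")
```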
- iripau.logging.group_log_lines(lines, thread_id_regex, main_thread_id='MainThread')[source]¶
For a log file containing entries from several threads, group the lines so that the lines coming from the same thread are contiguous, preserving the order within the group. Lines coming from the main thread will not be grouped.
- Parameters:
lines (Iterable[str]) – The lines to group.
thread_id_regex (str) – The pattern to get the thread name from each line. It should have one capturing group, which will be taken as the thread name.
main_thread_id (str) – The name of the main thread.
- Yields:
str – The next line according to the group ordering.
- Raises:
ValueError – If a line does not match thread_id_regex.
Example
Open a log file and print the grouped lines:
import sys
from iripau.logging import group_log_lines
# The thread name is at the beginning of each line, before ':'
thread_id_regex = "(.+): .*"
with open("/tmp/threaded.log") as log_file:
    sys.stdout.writelines(group_log_lines(log_file, thread_id_regex))
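The grouping behaviour can be illustrated with a pure-Python sketch. This is one plausible reading of the documented semantics (per-thread buckets emitted in first-appearance order, main-thread lines left in place), not the library's implementation:

```python
import re

def group_log_lines_sketch(lines, thread_id_regex, main_thread_id="MainThread"):
    """Yield lines so that each non-main thread's lines are contiguous."""
    pattern = re.compile(thread_id_regex)
    groups = {}   # thread name -> its lines, in original order
    order = []    # emission plan: standalone main lines or group keys
    for line in lines:
        match = pattern.match(line)
        if not match:
            raise ValueError(f"Line does not match pattern: {line!r}")
        thread = match.group(1)
        if thread == main_thread_id:
            order.append(("line", line))  # main-thread lines stay ungrouped
        else:
            if thread not in groups:
                groups[thread] = []
                order.append(("group", thread))
            groups[thread].append(line)
    for kind, value in order:
        if kind == "line":
            yield value
        else:
            yield from groups[value]
```

With interleaved input such as `["MainThread: start", "T1: a", "T2: x", "T1: b"]`, the two T1 lines come out adjacent while the main-thread line keeps its position.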