Module ciocore.loggeria¶
Functions¶
setup_conductor_logging¶
Source
def setup_conductor_logging(
    logger_level=DEFAULT_LEVEL_LOGGER,
    console_level=None,
    console_formatter=FORMATTER_LIGHT,
    log_filepath=None,
    file_level=None,
    file_formatter=FORMATTER_VERBOSE,
    multiproc=False,
    disable_console_logging=False,
    propagate=True,
):
    # Get the top/parent conductor logger
    logger = get_conductor_logger()

    # In some DCCs (e.g. Maya) we want to use the same handlers as root.
    # In other DCCs (e.g. Nuke) we don't.
    logger.propagate = propagate

    if logger_level:
        assert logger_level in LEVEL_MAP.values(), "Not a valid log level: %s" % logger_level
        # Set the main log level. Note that if this is very restrictive, you won't
        # see handler messages, regardless of the handlers' log levels.
        logger.setLevel(logger_level)

    if not disable_console_logging:
        # Handle debug, info, and warning records only. STDOUT
        console_handler_out = logging.StreamHandler(sys.__stdout__)
        console_handler_out.setLevel(logging.DEBUG)
        console_handler_out.addFilter(LogLevelFilter())
        logger.addHandler(console_handler_out)

        # Handle error and critical records only. STDERR
        console_handler_err = logging.StreamHandler(sys.__stderr__)
        console_handler_err.setLevel(logging.ERROR)
        logger.addHandler(console_handler_err)

        # Create a formatter and apply it to the handlers
        if console_formatter:
            console_handler_out.setFormatter(console_formatter)
            console_handler_err.setFormatter(console_formatter)

    # Create a file handler if a filepath was given
    if log_filepath:
        if file_level:
            assert file_level in LEVEL_MAP.values(), "Not a valid log level: %s" % file_level
        # Rotating file handler. Rotates every day (24 hours). Stores 7 days at a time.
        file_handler = create_file_handler(
            log_filepath, level=file_level, formatter=file_formatter, multiproc=multiproc
        )
        logger.addHandler(file_handler)
setup_conductor_logging(logger_level=20, console_level=None, console_formatter=<logging.Formatter object>, log_filepath=None, file_level=None, file_formatter=<logging.Formatter object>, multiproc=False, disable_console_logging=False, propagate=True)
-
This is a convenience function to help set up logging.
THIS SHOULD ONLY BE CALLED ONCE within an execution environment.
This function does the following:
- Creates/retrieves the logger object for the "conductor" package
- Sets that logger's log level to the given logger_level (optional)
- Creates two console handlers (stdout and stderr) and attaches them to the logger object
  - Optionally sets those console handlers' formatter to the given console_formatter
- Optionally creates a file handler (if a log filepath is given)
  - Optionally sets that file handler's log level to the given file_level
  - Optionally sets that file handler's formatter to the given file_formatter

console_formatter & file_formatter: Formatter objects, not just a string such as "DEBUG". This is because you may need more than a string to define a formatter object.

multiproc: bool. If True, a custom file handler is used that handles multiprocess logging correctly. This file handler spawns an additional background thread that drains a multiprocessing queue.
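The stdout/stderr split described above can be sketched with nothing but the standard library. The function name `setup_split_console_logging` and the callable filter below are illustrative only, not part of `ciocore.loggeria`:

```python
import logging
import sys


def setup_split_console_logging(name, level=logging.INFO):
    """Route DEBUG/INFO/WARNING to stdout and ERROR/CRITICAL to stderr."""
    logger = logging.getLogger(name)
    logger.setLevel(level)

    # stdout handler: a filter rejects everything at ERROR and above
    out_handler = logging.StreamHandler(sys.stdout)
    out_handler.setLevel(logging.DEBUG)
    out_handler.addFilter(lambda record: record.levelno < logging.ERROR)
    logger.addHandler(out_handler)

    # stderr handler: its level gate alone restricts it to ERROR and above
    err_handler = logging.StreamHandler(sys.stderr)
    err_handler.setLevel(logging.ERROR)
    logger.addHandler(err_handler)
    return logger
```

With this wiring, a `logger.warning(...)` appears only on stdout while `logger.error(...)` appears only on stderr, which is exactly the behavior `LogLevelFilter` provides below.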
create_file_handler¶
Source
def create_file_handler(filepath, level=None, formatter=None, multiproc=False):
    when = "h"  # rotate unit is "h" (hours)
    interval = 24  # rotate every 24 units (24 hours)
    backupCount = 7  # retain up to 7 log files (7 days of log files)

    log_dirpath = os.path.dirname(filepath)
    if not os.path.exists(log_dirpath):
        os.makedirs(log_dirpath)

    if multiproc:
        # Use a custom rotating file handler that handles multiprocessing properly
        handler = MPFileHandler(filepath, when=when, interval=interval, backupCount=backupCount)
    else:
        # Rotating file handler. Rotates every day (24 hours). Stores 7 days at a time.
        handler = logging.handlers.TimedRotatingFileHandler(
            filepath, when=when, interval=interval, backupCount=backupCount
        )

    if formatter:
        handler.setFormatter(formatter)
    if level:
        handler.setLevel(level)
    return handler
create_file_handler(filepath, level=None, formatter=None, multiproc=False)
-
Create a file handler object for the given filepath.
This is a ROTATING file handler, which rotates every day (24 hours) and stores up to 7 days of logs at a time (up to as many as 7 log files at once).
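For the non-multiprocess path, the rotation settings above map directly onto the stdlib `TimedRotatingFileHandler`. A minimal sketch (the `rotation_demo` logger name and temp path are illustrative):

```python
import logging
import logging.handlers
import os
import tempfile

# Equivalent rotation settings: rotate every 24 hours, keep 7 backups.
log_path = os.path.join(tempfile.mkdtemp(), "conductor.log")
handler = logging.handlers.TimedRotatingFileHandler(
    log_path, when="h", interval=24, backupCount=7
)
handler.setLevel(logging.INFO)
handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(message)s"))

logger = logging.getLogger("rotation_demo")
logger.addHandler(handler)
logger.setLevel(logging.INFO)
logger.info("hello")
handler.close()
```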
set_conductor_log_level¶
Source
def set_conductor_log_level(log_level):
    assert log_level in LEVEL_MAP.keys(), "Invalid log_level: %s" % log_level
    logger = get_conductor_logger()
    logger.setLevel(log_level)
    logger.info("Changed log level to %s", log_level)
set_conductor_log_level(log_level)
- Set the "conductor" package's logger to the given log level.
get_conductor_logger¶
Source
def get_conductor_logger():
    return logging.getLogger(CONDUCTOR_LOGGER_NAME)
get_conductor_logger()
- Return the "conductor" package's logger object.
get_default_log_dir¶
Source
def get_default_log_dir(platform=None):
    # platform = platform or sys.platform
    return os.path.expanduser(os.path.join("~", ".conductor", "logs", "conductor.log"))
get_default_log_dir(platform=None)
Classes¶
LogLevelFilter¶
Source
def __init__(self, name=""):
    super(LogLevelFilter, self).__init__(name)
LogLevelFilter(name='')
-
Filter log messages based on level.
By default, the logger sends everything to stderr. This is a problem in Clarisse at least, because stderr prints RED in the log panel and pops up a floating window to display what it thinks is an error. Customers get worried. To alleviate this we create two handlers, one for stdout and one for stderr, and route the appropriate log records to each. This filter allows only warning, info, and debug records through, and is used by the handler that logs to stdout. The stderr handler doesn't need a filter, as its level is simply set to pass error and critical records only.
Initialize a filter.
Initialize with the name of the logger which, together with its children, will have its events allowed through the filter. If no name is specified, allow every event.
Ancestors (in MRO)¶
- logging.Filter
Methods¶
filter¶
Source
def filter(self, record):
    return int(record.levelno < logging.ERROR)
filter(self, record)
-
Determine if the specified record is to be logged.
Returns True if the record should be logged, or False otherwise. If deemed appropriate, the record may be modified in-place.
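The same pattern generalizes to any level ceiling. A hedged stdlib-only sketch (the class name `MaxLevelFilter` is illustrative, not part of ciocore, which hard-codes `logging.ERROR` as shown above):

```python
import logging


class MaxLevelFilter(logging.Filter):
    """Pass only records strictly below a given level (default: below ERROR)."""

    def __init__(self, max_level=logging.ERROR, name=""):
        super().__init__(name)
        self.max_level = max_level

    def filter(self, record):
        # True keeps the record; False drops it from this handler
        return record.levelno < self.max_level
```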
MPFileHandler¶
Source
def __init__(
    self, filename, when="h", interval=1, backupCount=0, encoding=None, delay=0, utc=0
):
    logging.Handler.__init__(self)
    self._handler = TimedRotatingFileHandler(
        filename,
        when=when,
        interval=interval,
        backupCount=backupCount,
        encoding=encoding,
        delay=delay,
        utc=utc,
    )
    self.queue = multiprocessing.Queue()
    t = threading.Thread(target=self.receive)
    t.daemon = True
    t.start()
MPFileHandler(filename, when='h', interval=1, backupCount=0, encoding=None, delay=0, utc=0)
-
Multiprocess-safe Rotating File Handler.
Copied from: http://stackoverflow.com/questions/641420/how-should-i-log-while-using-multiprocessing-in-python
See TimedRotatingFileHandler for arg docs.
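Since Python 3.2, the standard library offers `QueueHandler`/`QueueListener`, which implement the same queue-based fan-in idea as MPFileHandler. A minimal sketch (the helper name `make_mp_logging` is illustrative; any handler can stand in for the file handler as the drain target):

```python
import logging
import logging.handlers
import multiprocessing


def make_mp_logging(target_handler):
    """Fan records from multiple processes through a queue to one handler.

    A QueueHandler is attached to each process's logger; the listener's
    background thread in the parent drains the queue and emits each record
    via target_handler (e.g. a TimedRotatingFileHandler).
    """
    queue = multiprocessing.Queue(-1)
    queue_handler = logging.handlers.QueueHandler(queue)
    listener = logging.handlers.QueueListener(queue, target_handler)
    listener.start()
    return queue_handler, listener
```

Call `listener.stop()` at shutdown to drain the queue and join the background thread, analogous to MPFileHandler's `close`.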
Ancestors (in MRO)¶
- logging.Handler
- logging.Filterer
Methods¶
setFormatter¶
Source
def setFormatter(self, fmt):
    logging.Handler.setFormatter(self, fmt)
    self._handler.setFormatter(fmt)
setFormatter(self, fmt)
- Set the formatter for this handler.
receive¶
Source
def receive(self):
    while True:
        try:
            record = self.queue.get()
            self._handler.emit(record)
        except (KeyboardInterrupt, SystemExit):
            raise
        except EOFError:
            break
        except BaseException:
            traceback.print_exc(file=sys.stderr)
receive(self)
send¶
Source
def send(self, s):
    self.queue.put_nowait(s)
send(self, s)
emit¶
Source
def emit(self, record):
    try:
        s = self._format_record(record)
        self.send(s)
    except (KeyboardInterrupt, SystemExit):
        raise
    except BaseException:
        self.handleError(record)
emit(self, record)
-
Format the record and put it on the multiprocessing queue. The background receive thread then writes it out via the wrapped TimedRotatingFileHandler.
close¶
Source
def close(self):
    self._handler.close()
    logging.Handler.close(self)
close(self)
-
Tidy up any resources used by the handler.
This version removes the handler from an internal map of handlers, _handlers, which is used for handler lookup by name. Subclasses should ensure that this gets called from overridden close() methods.
TableStr¶
Source
def __init__(self, data, column_names, title="", footer="", upper_headers=True):
    self.data = data
    self.column_names = column_names
    self.title = title
    self.footer = footer
    self.uppper_headers = upper_headers
TableStr(data, column_names, title='', footer='', upper_headers=True)
-
A class to help log/print tables of data
Example output:
######## DOWNLOAD HISTORY ########
COMPLETED AT         DOWNLOAD ID       JOB    TASK  SIZE      ACTION  DURATION  THREAD     FILEPATH
2016-01-16 01:12:46  5228833175240704  00208  010   137.51MB  DL      0:00:57   Thread-12  /tmp/conductor_daemon_dl/04/cental/cental.010.exr
2016-01-16 01:12:42  6032237141164032  00208  004   145.48MB  DL      0:02:24   Thread-2   /tmp/conductor_daemon_dl/04/cental/cental.004.exr
2016-01-16 01:12:40  5273802288136192  00208  012   140.86MB  DL      0:02:02   Thread-16  /tmp/conductor_daemon_dl/04/cental/cental.012.exr
Args:
- data: list of dicts. Each dict represents a row, where the key is the column name and the value is the cell value.
- column_names: list of str. The columns of data to show, in the order in which they are shown.
- title: str. If provided, printed above the table.
- footer: str. If provided, printed below the table.
- upper_headers: bool. If True, column header names are automatically uppercased.
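The core padding-and-joining idea behind TableStr can be sketched as a single function. This is an illustrative standalone version (the name `format_table` is not part of ciocore, and title/footer/modifiers are omitted for brevity):

```python
def format_table(data, column_names, column_spacer="  "):
    """Left-justify each column to its widest cell, header included."""
    columns = {}
    for name in column_names:
        # Build the column top-down: uppercased header, then one cell per row
        cells = [name.upper()] + [str(row.get(name, "")) for row in data]
        width = max(len(cell) for cell in cells)
        columns[name] = [cell.ljust(width) for cell in cells]

    # Re-join the padded columns row by row (+1 accounts for the header row)
    rows = []
    for i in range(len(data) + 1):
        rows.append(column_spacer.join(columns[name][i] for name in column_names))
    return "\n".join(rows)
```

For example, `format_table([{"job": "00208", "task": "010"}], ["job", "task"])` produces a header line `JOB    TASK` followed by one aligned data row.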
Descendants¶
- ciocore.downloader.HistoryTableStr
Class variables¶
header_modifiers¶
header_modifiers
cell_modifiers¶
cell_modifiers
column_spacer¶
column_spacer
row_spacer¶
row_spacer
Methods¶
make_table_str¶
Source
def make_table_str(self):
    column_strs = {}
    for column_name in self.column_names:
        column_strs[column_name] = self.make_column_strs(column_name, self.data)

    rows = []
    for row_idx in range(len(self.data) + 1):  # add 1 to account for header row
        row_data = []
        for column_name in self.column_names:
            row_data.append(column_strs[column_name][row_idx])
        rows.append(self.column_spacer.join(row_data))

    rows.insert(0, self.get_title())
    rows.append(self.get_footer())
    return self.row_spacer.join(rows)
make_table_str(self)
-
Create and return a final table string that is suitable to print/log.
This is achieved by creating a list of cell strings for each column in the table. Once all column lists have been created, their entries are joined row by row with a constant spacer string (self.column_spacer). The resulting rows are then prefixed with the given title (self.title) and suffixed with the given footer (self.footer).
make_column_strs¶
Source
def make_column_strs(self, column_name, data):
    column_header = self.modify_header(column_name)

    column_cells = []
    for row_dict in data:
        cell_data = row_dict.get(column_name, "")
        cell_data = self.modify_cell(column_name, cell_data)
        column_cells.append(str(cell_data))

    column_data = [column_header] + column_cells
    column_width = len(max(column_data, key=len))

    column_strs = []
    for cell_str in column_data:
        column_strs.append(cell_str.ljust(column_width))
    return column_strs
make_column_strs(self, column_name, data)
- Return a list of cell strings representing one column of data (header first), each left-justified to the column's width.
modify_header¶
Source
def modify_header(self, column_name):
    header_modifier = self.header_modifiers.get(column_name)
    if header_modifier:
        column_name = header_modifier(column_name)
    return column_name.upper() if self.uppper_headers else column_name
modify_header(self, column_name)
-
Modify and return the given column name.
This provides an opportunity to adjust what the header should consist of.
modify_cell¶
Source
def modify_cell(self, column_name, cell_data):
    cell_modifier = self.cell_modifiers.get(column_name)
    if cell_modifier:
        cell_data = cell_modifier(cell_data)
    return cell_data
modify_cell(self, column_name, cell_data)
-
Modify and return the given cell data of the given column name.
This provides an opportunity to adjust what the cell should consist of.
get_title¶
Source
def get_title(self):
    return self.title
get_title(self)
get_footer¶
Source
def get_footer(self):
    return self.footer
get_footer(self)