api: fix 'kallithea-api --save-config'
Commit eca0cb56a822 attempted to fix a type inconsistency that caused a
failure in the 'kallithea-api' tool when using '--save-config', but
unfortunately did not fix the problem completely.
The following error still appeared:

  Traceback (most recent call last):
    File ".../bin/kallithea-api", line 33, in <module>
      sys.exit(load_entry_point('Kallithea', 'console_scripts', 'kallithea-api')())
    File ".../bin/kallithea_api.py", line 84, in main
      'apihost': args.apihost})
    File ".../bin/base.py", line 104, in __init__
      self.make_config(config)
    File ".../bin/base.py", line 132, in make_config
      ext_json.dump(config, f, indent=4)
    File "/usr/lib/python3.7/json/__init__.py", line 180, in dump
      fp.write(chunk)
  TypeError: a bytes-like object is required, not 'str'
The json module documentation says:
https://docs.python.org/3.7/library/json.html#basic-usage
"The json module always produces str objects, not bytes objects. Therefore,
fp.write() must support str input."
Therefore, instead of opening the file in binary mode and writing bytes,
open it in text mode and write strings.
For symmetry reasons, we make the same change when _loading_ the config
file, but this code worked regardless.
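The fix amounts to opening the config file in text mode on both the save and
load paths. A minimal sketch of the idea, using the stdlib json module in
place of Kallithea's ext_json wrapper (save_config/load_config are
illustrative names, not the actual methods in base.py):

```python
import json

def save_config(config, path):
    # Open in text mode ("w"), not binary ("wb"): json.dump always
    # produces str chunks, so the file object's write() must accept str.
    with open(path, "w") as f:
        json.dump(config, f, indent=4)

def load_config(path):
    # Symmetric change on the load side; text mode works here too,
    # since json.load reads str from a text-mode file object.
    with open(path) as f:
        return json.load(f)
```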
.. _statistics:

=====================
Repository statistics
=====================
Kallithea has a *repository statistics* feature, disabled by default. When
enabled, the number of commits per committer is visualized in a timeline. This
feature can be enabled using the ``Enable statistics`` checkbox on the
repository ``Settings`` page.
The statistics system makes heavy demands on server resources, so to keep a
balance between usability and performance, statistics are cached in the
database and gathered incrementally.
When Celery is disabled:
On the first visit to the summary page, a batch of 250 commits is parsed and
added to the statistics cache. This incremental gathering also happens on each
visit to the statistics page, until all commits have been fetched.
Statistics are kept cached until additional commits are added to the
repository. In that case, Kallithea fetches only the new commits when
updating its statistics cache.
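The incremental gathering described above can be sketched as follows. This is
a hypothetical illustration of the batching idea only, not Kallithea's actual
code; the function name, the cache shape, and the commit representation are
all assumptions:

```python
BATCH_SIZE = 250  # commits parsed per page visit, as described above

def gather_statistics(commits, cache):
    """Parse at most BATCH_SIZE new commits per call, resuming from the
    cached position, so each page visit does a bounded amount of work.
    Returns True once every commit has been counted."""
    start = cache.get("parsed", 0)
    batch = commits[start:start + BATCH_SIZE]
    per_author = cache.setdefault("per_author", {})
    for commit in batch:
        author = commit["author"]
        per_author[author] = per_author.get(author, 0) + 1
    cache["parsed"] = start + len(batch)
    return cache["parsed"] >= len(commits)
```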
When Celery is enabled:
On the first visit to the summary page, Kallithea creates tasks that execute
on Celery workers. These tasks gather statistics until all commits are
parsed: each task parses 250 commits, then launches a new task.
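The task-chaining behaviour can be simulated without Celery. A hedged sketch
of the pattern (a plain queue stands in for the Celery broker; this is not
Kallithea's actual task code):

```python
from collections import deque

BATCH_SIZE = 250

def run_chained_tasks(commits):
    """Simulate chained tasks: each 'task' parses one batch of commits,
    then enqueues a follow-up task until all commits are parsed."""
    stats = {}
    queue = deque([0])  # offsets of pending batch tasks
    while queue:
        start = queue.popleft()
        for commit in commits[start:start + BATCH_SIZE]:
            author = commit["author"]
            stats[author] = stats.get(author, 0) + 1
        if start + BATCH_SIZE < len(commits):
            # "Launch" the next task in the chain.
            queue.append(start + BATCH_SIZE)
    return stats
```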