Max Series of Database Exceeded #52

Open
qxslew opened this issue Nov 22, 2020 · 0 comments
qxslew commented Nov 22, 2020

After just shy of a day of running, the journaler begins to error out and its container shows a status of 'restarting' in Docker. The container logs contain repeated errors indicating that the database's max series limit has been exceeded.

INFO:pika.adapters.utils.connection_workflow:Pika version 1.1.0 connecting to ('172.18.0.6', 5672)
INFO:pika.adapters.utils.io_services_utils:Socket connected: <socket.socket fd=6, family=AddressFamily.AF_INET, type=SocketKind.SOCK_STREAM, proto=6, laddr=('172.18.0.5', 38098), raddr=('172.18.0.6', 5672)>
INFO:pika.adapters.utils.connection_workflow:Streaming transport linked up: (<pika.adapters.utils.io_services_utils._AsyncPlaintextTransport object at 0x7f9bb372f730>, _StreamingProtocolShim: <SelectConnection PROTOCOL transport=<pika.adapters.utils.io_services_utils._AsyncPlaintextTransport object at 0x7f9bb372f730> params=>).
INFO:pika.adapters.utils.connection_workflow:AMQPConnector - reporting success: <SelectConnection OPEN transport=<pika.adapters.utils.io_services_utils._AsyncPlaintextTransport object at 0x7f9bb372f730> params=>
INFO:pika.adapters.utils.connection_workflow:AMQPConnectionWorkflow - reporting success: <SelectConnection OPEN transport=<pika.adapters.utils.io_services_utils._AsyncPlaintextTransport object at 0x7f9bb372f730> params=>
INFO:pika.adapters.blocking_connection:Connection workflow succeeded: <SelectConnection OPEN transport=<pika.adapters.utils.io_services_utils._AsyncPlaintextTransport object at 0x7f9bb372f730> params=>
INFO:sos-journaler:Connected to RabbitMQ
INFO:pika.adapters.blocking_connection:Created channel=1
Traceback (most recent call last):
File "/sos-journaler/main.py", line 40, in
main()
File "/sos-journaler/main.py", line 36, in main
channel.start_consuming()
File "/usr/local/lib/python3.9/site-packages/pika/adapters/blocking_connection.py", line 1866, in start_consuming
self._process_data_events(time_limit=None)
File "/usr/local/lib/python3.9/site-packages/pika/adapters/blocking_connection.py", line 2027, in _process_data_events
self.connection.process_data_events(time_limit=time_limit)
File "/usr/local/lib/python3.9/site-packages/pika/adapters/blocking_connection.py", line 834, in process_data_events
self._dispatch_channel_events()
File "/usr/local/lib/python3.9/site-packages/pika/adapters/blocking_connection.py", line 566, in _dispatch_channel_events
impl_channel._get_cookie()._dispatch_events()
File "/usr/local/lib/python3.9/site-packages/pika/adapters/blocking_connection.py", line 1493, in _dispatch_events
consumer_info.on_message_callback(self, evt.method,
File "/sos-journaler/sos_journaler/message_handling.py", line 19, in on_message
self._db.write_points([point])
File "/usr/local/lib/python3.9/site-packages/influxdb/client.py", line 594, in write_points
return self._write_points(points=points,
File "/usr/local/lib/python3.9/site-packages/influxdb/client.py", line 672, in _write_points
self.write(
File "/usr/local/lib/python3.9/site-packages/influxdb/client.py", line 404, in write
self.request(
File "/usr/local/lib/python3.9/site-packages/influxdb/client.py", line 369, in request
raise InfluxDBClientError(err_msg, response.status_code)
influxdb.exceptions.InfluxDBClientError: 400: {"error":"partial write: max-series-per-database limit exceeded: (1000000) dropped=1"}

A du -h of /var/lib/influxdb/data/fixm shows about 11GB of storage being used.
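For reference, the series count that trips this limit can be inspected with the same influxdb client the journaler uses. The following is a minimal diagnostic sketch, assuming InfluxDB 1.x reachable on localhost:8086 and the database name "fixm" taken from the path above; host, port, and credentials are placeholders, not values from this deployment.

from influxdb import InfluxDBClient

# Connect to the same database the journaler writes to (connection details assumed).
client = InfluxDBClient(host="localhost", port=8086, database="fixm")

# SHOW SERIES CARDINALITY reports roughly how many unique series the database holds.
# Writes that would create new series start being dropped once this count reaches
# the max-series-per-database limit (1,000,000 by default), producing the 400 above.
result = client.query("SHOW SERIES CARDINALITY ON fixm")
print(list(result.get_points()))

The limit itself comes from the max-series-per-database setting in InfluxDB's [data] configuration section (exposed as INFLUXDB_DATA_MAX_SERIES_PER_DATABASE if the stack runs the official influxdb:1.x Docker image); raising it or setting it to 0 removes the cap, though steady growth to a million series suggests the points being written carry high-cardinality tags.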
