Commit

huge modify XD
as535364 committed Jun 13, 2022
1 parent ad7dadd commit aa2088c
Showing 12 changed files with 65 additions and 68 deletions.
2 changes: 0 additions & 2 deletions .envExample

This file was deleted.

7 changes: 2 additions & 5 deletions .gitignore
Original file line number Diff line number Diff line change
@@ -1,16 +1,13 @@
 # secret
-.env
+src/.env
+src/news.txt
 
 # VSCode
 .vscode/
 
 # JetBrains
 .idea/
 
-# config file
-config.json
-news.txt
-
 # Byte-compiled / optimized / DLL files
 __pycache__/
 *.py[cod]
Expand Down
6 changes: 3 additions & 3 deletions Dockerfile
Original file line number Diff line number Diff line change
@@ -2,9 +2,9 @@ FROM python:3.8-alpine
 
 WORKDIR /usr/src/app
 
-COPY requirements.txt ./
+COPY src/requirements.txt ./
 RUN pip install --no-cache-dir -r requirements.txt
 
-COPY . .
+COPY ./src .
 
-CMD [ "python", "./bot.py" ]
+CMD [ "python", "-u", "./bot.py" ]
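The added `-u` flag switches Python to unbuffered stdout/stderr, so the timestamps `bot.py` prints show up in `docker logs` immediately instead of waiting in a buffer. For reference, the same effect can be had with an environment variable in the image (an equivalent alternative, not part of this commit):

```dockerfile
# Any non-empty value disables Python's output buffering,
# making `python ./bot.py` behave like `python -u ./bot.py`.
ENV PYTHONUNBUFFERED=1
```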
16 changes: 6 additions & 10 deletions README.md
Original file line number Diff line number Diff line change
@@ -2,28 +2,24 @@
 
 ## For local development
 1. Run `pip install --no-cache-dir -r requirements.txt` (there may be some useless packages such as packages for telegram bot)
-2. Read the file crawlers/ntnucsiecrawler.py the return value of `get_update` method
+2. Read the file crawlers/ntnucsie.py the return value of `get_update` method
 3. Coding your own crawlers/yourschoolcrawler.py support `get_update` method
 4. Use `python3 test.py` to test your crawler and don't forget to modify the source of the imported `Crawler` class !!!
 
 
 ## For deploy
-(TODO: use docker-compose for convenience)
 1.
 ```bash
-git clone https://github.com/as535364/NTNU-CSIE-Notify.git
+git clone https://github.com/as535364/CSIE-Notify
 ```
-2. Modify `.env`.
-
+2. In src directory `cp .envExample .env` and modify `.env`.
+
 **Telegram chat id must be an integer.**
 
 If there are multiple ids, use commas to separate them.
 
 3.
 ```bash
-docker build -t notify .
-```
-4.
-```bash
-docker run -d --name notify notify
+docker-compose up -d
 ```
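The new deploy step runs `docker-compose up -d`, but no `docker-compose.yml` appears in this commit. A minimal file consistent with the existing `Dockerfile` might look like the following sketch; the service name and env-file path are assumptions, not the repository's actual configuration:

```yaml
services:
  notify:
    build: .            # uses the Dockerfile at the repository root
    env_file:
      - ./src/.env      # TOKEN and CHAT_ID, per src/.envExample
    restart: unless-stopped
```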
39 changes: 0 additions & 39 deletions bot.py

This file was deleted.

3 changes: 0 additions & 3 deletions requirements.txt

This file was deleted.

2 changes: 2 additions & 0 deletions src/.envExample
Original file line number Diff line number Diff line change
@@ -0,0 +1,2 @@
TOKEN=7122:botToken
CHAT_ID=100,200,300,-200
40 changes: 40 additions & 0 deletions src/bot.py
Original file line number Diff line number Diff line change
@@ -0,0 +1,40 @@
from dotenv import load_dotenv
import os
import time
from datetime import datetime, timezone, timedelta
import schedule

import telebot
from crawlers.nycucs import Crawler

# load bot settings
load_dotenv()
TOKEN = os.getenv('TOKEN')
CHAT_IDS = list(map(int, os.getenv('CHAT_ID').split(',')))
tb = telebot.TeleBot(TOKEN, parse_mode='HTML')


def send_news():
    tz = timezone(timedelta(hours=+8))
    print('Update Time:', datetime.now(tz).isoformat(timespec="seconds"))
    news = Crawler().get_update()
    news_lists = []
    for new in news:
        news_lists.append(f'{new["title"]}\n\n{new["link"]}')
    text = '\n\n\n'.join(news_lists)

    for chat_id in CHAT_IDS:
        if text:
            tb.send_message(chat_id, text)


if __name__ == '__main__':
    if not os.path.exists('news.txt'):
        with open('news.txt', 'w') as f:
            f.write('[]')

    send_news()
    schedule.every(5).minutes.do(send_news)
    while True:
        schedule.run_pending()
        time.sleep(1)
File renamed without changes.
File renamed without changes.
10 changes: 10 additions & 0 deletions src/requirements.txt
Original file line number Diff line number Diff line change
@@ -0,0 +1,10 @@
beautifulsoup4==4.11.1
certifi==2022.5.18.1
charset-normalizer==2.0.12
idna==3.3
pyTelegramBotAPI==4.5.1
python-dotenv==0.20.0
requests==2.28.0
schedule==1.1.0
soupsieve==2.3.2.post1
urllib3==1.26.9
8 changes: 2 additions & 6 deletions test.py → src/test.py
Original file line number Diff line number Diff line change
@@ -1,8 +1,6 @@
 import os
 import json
-# from dotenv import load_dotenv
-from crawlers.ntnucsie import Crawler
-
+from crawlers.nycucs import Crawler
 
 if __name__ == '__main__':
     # init
@@ -13,8 +11,6 @@
     with open('news.txt', 'w') as f:
         f.write('[]')
 
-    # load_dotenv()
-    # print(os.getenv('TOKEN'), os.getenv('CHAT_ID'))
     c = Crawler()
 
     print(json.dumps(c.get_update(), indent=2, ensure_ascii=False))
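Nothing in the diff spells out the contract a new crawler must satisfy. Judging by how `bot.py` and `test.py` consume the result, `get_update()` returns a list of dicts with `title` and `link` keys, containing only items not reported on a previous run. A minimal sketch of a `crawlers/yourschoolcrawler.py` under those assumptions follows; the URL, the CSS selector, and the cache layout are placeholders, not the repository's actual code:

```python
import json
import os

import requests
from bs4 import BeautifulSoup

# Placeholder URL: point this at your school's announcement page.
NEWS_URL = 'https://example.edu/announcements'
# Matches the '[]' initialisation that bot.py and test.py write.
CACHE_FILE = 'news.txt'


class Crawler:
    def fetch(self):
        # Network access kept separate so parse() can be tested offline.
        return requests.get(NEWS_URL, timeout=10).text

    def parse(self, html):
        # Placeholder selector: adjust to the real page structure.
        soup = BeautifulSoup(html, 'html.parser')
        return [{'title': a.get_text(strip=True), 'link': a['href']}
                for a in soup.select('a.news-item')]

    def get_update(self):
        # Report each link at most once by remembering it in CACHE_FILE.
        items = self.parse(self.fetch())
        seen = []
        if os.path.exists(CACHE_FILE):
            with open(CACHE_FILE) as f:
                seen = json.load(f)
        fresh = [it for it in items if it['link'] not in seen]
        with open(CACHE_FILE, 'w') as f:
            json.dump(seen + [it['link'] for it in fresh], f)
        return fresh
```

Splitting `parse` from `fetch` keeps the selector logic testable without hitting the network, which is how `test.py`-style checks can stay fast and deterministic.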
