
common: general improvements #695

Closed
dougsland wants to merge 7 commits into main from the timeout-handle branch

Conversation

@dougsland (Collaborator) commented Feb 1, 2025

  • If a timeout happens, we try 5 times before sending a Timeout error to users.

  • Improve the user experience when a timeout occurs.

  • Add logging to handle info/warn/error messages.

  • Add an MIT header.

Resolves: #693

Summary by Sourcery

Enhancements:

  • Improve the user experience when a timeout occurs during download.

@sourcery-ai bot (Contributor) commented Feb 1, 2025

Reviewer's Guide by Sourcery

This pull request introduces several improvements to the common module, including retry logic for downloads, enhanced error handling, and logging. It also adds an MIT license header.

Sequence diagram for improved file download with retry logic

sequenceDiagram
    participant C as Client
    participant D as download_file()
    participant H as HttpClient
    participant S as Server

    C->>D: download_file(url, dest_path)
    activate D
    rect rgb(200, 255, 200)
        note over D: Retry loop (max 5 attempts)
        D->>H: init(url, headers, output_file)
        activate H
        H->>S: HTTP GET request
        alt Success
            S-->>H: File data
            H-->>D: Success
            D-->>C: Return
        else Timeout/Error
            S--xH: Timeout/Error
            H--xD: Error
            note over D: Log error & wait
            note over D: Exponential backoff
            D->>H: Retry request
        end
        deactivate H
    end
    alt Max retries reached
        D--xC: Exit with error
    end
    deactivate D

Class diagram showing download_file function changes

classDiagram
    class download_file {
        +download_file(url: str, dest_path: str, headers: dict, show_progress: bool)
    }

    class HttpClient {
        +init(url: str, headers: dict, output_file: str, progress: bool)
    }

    class Logger {
        +info(message: str)
        +warning(message: str)
        +error(message: str)
        +debug(message: str)
    }

    download_file ..> HttpClient : uses
    download_file ..> Logger : uses

    note for download_file "Added retry mechanism\nImproved error handling\nAdded logging"

State diagram for download retry mechanism

stateDiagram-v2
    [*] --> Downloading
    Downloading --> Success: Download complete
    Downloading --> RetryWait: Error/Timeout
    RetryWait --> Downloading: Retry attempt
    RetryWait --> Failed: Max retries reached
    Success --> [*]
    Failed --> [*]

    note right of RetryWait
        Exponential backoff:
        0.1s, 0.2s, 0.4s...
    end note

File-Level Changes

  • Added an MIT license header to the file (ramalama/common.py).
      • Added the MIT license header at the beginning of the file.
  • Implemented retry logic for file downloads (ramalama/common.py); a sketch of this pattern follows the list.
      • Added a retry mechanism to the download_file function.
      • The function now retries up to 5 times before failing.
      • Implemented exponential backoff between retries.
  • Improved error handling and logging during file downloads (ramalama/common.py).
      • Added logging for successful downloads.
      • Added logging for HTTP errors, network errors, and timeouts.
      • Added logging for unexpected errors.
      • Improved the error message when the download fails after multiple retries.
      • Added debug logging for unexpected errors.
  • Added basic logging configuration (ramalama/common.py).
      • Configured basic logging at the warning level.
      • Set the format for log messages.
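
To make the retry pattern above concrete, here is a minimal sketch. It is not the actual ramalama/common.py implementation: plain urllib stands in for the project's HTTP client, and the messages only mirror the logs shown later in this thread.

import logging
import sys
import time
import urllib.request

# Basic logging configuration as described above: warning level plus a timestamped format.
logging.basicConfig(level=logging.WARNING,
                    format="%(asctime)s - %(levelname)s - %(message)s")

MAX_RETRIES = 5


def download_file(url, dest_path):
    """Download url to dest_path, retrying on timeouts and network errors."""
    delay = 0.1  # first backoff step; doubled after every failed attempt
    for attempt in range(1, MAX_RETRIES + 1):
        try:
            # Stand-in for the project's HTTP client; plain urllib keeps the sketch runnable.
            urllib.request.urlretrieve(url, dest_path)
            logging.info("Downloaded %s to %s", url, dest_path)
            return
        except TimeoutError:
            logging.warning("TimeoutError: The server took too long to respond. "
                            "Retrying %d/%d...", attempt, MAX_RETRIES)
        except OSError as e:  # also covers urllib.error.URLError and socket errors
            logging.warning("I/O Error: %s. Retrying %d/%d...", e, attempt, MAX_RETRIES)
        except Exception as e:
            logging.error("Unexpected error: %s", e)
            logging.debug("Unexpected error details", exc_info=True)
        time.sleep(delay)
        delay *= 2  # exponential backoff: 0.1s, 0.2s, 0.4s, ...

    logging.error("Download failed after multiple attempts.\n"
                  "Possible causes:\n"
                  "- Internet connection issue\n"
                  "- Server is down or unresponsive\n"
                  "- Firewall or proxy blocking the request")
    sys.exit(1)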


@dougsland (Collaborator Author):
Tested on macOS; I will give it a try on Linux, please wait.

@sourcery-ai bot (Contributor) left a comment

Hey @dougsland - I've reviewed your changes - here's some feedback:

Overall Comments:

  • The download_file() function should raise a RuntimeError after max retries instead of calling sys.exit(1), so callers can handle the error appropriately (a sketch of this change follows the checklist below).
Here's what I looked at during the review
  • 🟡 General issues: 2 issues found
  • 🟢 Security: all looks good
  • 🟢 Testing: all looks good
  • 🟢 Complexity: all looks good
  • 🟢 Documentation: all looks good
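
As a hedged illustration of the suggestion above, the last retry could raise an exception for callers to handle rather than terminating the process. The exception name and the use of plain urllib here are assumptions for the sketch, not the project's actual API.

import urllib.request


class DownloadError(RuntimeError):
    """Raised when a download keeps failing after all retries."""


def download_file(url, dest_path, max_retries=5):
    last_error = None
    for attempt in range(1, max_retries + 1):
        try:
            # Plain urllib stands in for the real HTTP client.
            urllib.request.urlretrieve(url, dest_path)
            return
        except (TimeoutError, OSError) as e:
            last_error = e
    # Callers (for example the CLI entry point) decide how to report the failure and exit.
    raise DownloadError(f"Download of {url} failed after {max_retries} attempts") from last_error

The CLI entry point would then catch DownloadError, print the friendly multi-line message, and exit with a non-zero status.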


@dougsland (Collaborator Author):
On macOS:

$ ramalama run deepseek-r1:8b          
 20% |███████████████████████████████                                                                                                                              |  981.34 MB/   4.58 GB   9.36 MB/s    8m 17sWARNING:root:⚠️  TimeoutError: The server took too long to respond. Retrying 1/5...
ERROR:root:🚨 Unexpected error: Network error: [Errno 8] nodename nor servname provided, or not known
ERROR:root:🚨 Unexpected error: Network error: [Errno 8] nodename nor servname provided, or not known
ERROR:root:🚨 Unexpected error: Network error: [Errno 8] nodename nor servname provided, or not known
ERROR:root:🚨 Unexpected error: Network error: [Errno 8] nodename nor servname provided, or not known

❌ Download failed after multiple attempts.
Possible causes:
- Internet connection issue
- Server is down or unresponsive
- Firewall or proxy blocking the request
$

@dougsland (Collaborator Author) commented Feb 1, 2025

On Linux the output seems good too:

douglas@fedora:~/linux-test/ramalama$ ramalama run deepseek-r1:8b
  3% |█████                                                                                                                                                                                       |  142.34 MB/   4.58 GB  26.32 MB/s    2m 52s2025-01-31 23:43:52,181 - WARNING - ⚠️  TimeoutError: The server took too long to respond. Retrying 1/5...
2025-01-31 23:43:52,384 - ERROR - 🚨 Unexpected error: Network error: [Errno -3] Temporary failure in name resolution
2025-01-31 23:43:52,785 - ERROR - 🚨 Unexpected error: Network error: [Errno -3] Temporary failure in name resolution
2025-01-31 23:43:53,587 - ERROR - 🚨 Unexpected error: Network error: [Errno -3] Temporary failure in name resolution
2025-01-31 23:43:55,189 - ERROR - 🚨 Unexpected error: Network error: [Errno -3] Temporary failure in name resolution

@dougsland (Collaborator Author):
Holy moly, these AI bots on GitHub are getting good; let me implement the suggestions.

@dougsland force-pushed the timeout-handle branch 3 times, most recently from 4ce2fee to 6ebcbde, on February 1, 2025 at 05:09
@dougsland (Collaborator Author):
I believe it's ready for review :)

@dougsland (Collaborator Author):
The CI/CD error is not related to this patch.

@dougsland (Collaborator Author):
@rhatdan PTAL

@ericcurtin (Collaborator):
We should check if the terminal actually supports emojis before printing them, this is how systemd does it:

https://github.com/systemd/systemd/blob/75ee025c5de5d753dc1d8a28f8780247f5a887ae/src/basic/glyph-util.c#L8

@ericcurtin ericcurtin closed this Feb 1, 2025
@ericcurtin ericcurtin reopened this Feb 1, 2025
@dougsland (Collaborator Author) commented Feb 1, 2025

> We should check if the terminal actually supports emojis before printing them, this is how systemd does it:
>
> https://github.com/systemd/systemd/blob/75ee025c5de5d753dc1d8a28f8780247f5a887ae/src/basic/glyph-util.c#L8

I will skip emojis for now, I guess, to make it less complicated. :(

@ericcurtin (Collaborator) commented Feb 1, 2025

> We should check if the terminal actually supports emojis before printing them, this is how systemd does it:
> https://github.com/systemd/systemd/blob/75ee025c5de5d753dc1d8a28f8780247f5a887ae/src/basic/glyph-util.c#L8

> I will skip emojis for now, I guess, to make it less complicated. :(

It would be something like this FWIW, if we get around to it:

import os

def is_locale_utf8():
    return 'UTF-8' in os.getenv('LC_CTYPE', '') or 'UTF-8' in os.getenv('LANG', '')

def emoji_enabled():
    term = os.getenv("TERM")
    if not term or term in ("dumb", "linux"):
        return False

    return is_locale_utf8()

@rhatdan (Member) commented Feb 1, 2025

The emojis are fun, but we want to make sure they work on Windows, Linux, and Mac terminals.

@dougsland (Collaborator Author):

Thanks, tonight I will double-check and test it.

@dougsland (Collaborator Author):
> The emojis are fun, but we want to make sure they work on Windows, Linux, and Mac terminals.

Already tested on macOS and Linux; it works great. Windows will be tested soon.

Added a condition to check whether the source model already exists; otherwise, pull it down for the convert or push (convert and push were the only CLI functions that used _get_source, as far as I could tell)

Signed-off-by: Kush Gupta <[email protected]>
Give the user a more descriptive error when getting an HTTP Error 404: Not Found

Signed-off-by: Kush Gupta <[email protected]>
Update the error message for not finding an Ollama model in the convert test

Signed-off-by: Kush Gupta <[email protected]>
Give the user a clearer message on HTTP 404 Not Found errors when trying and failing to pull from Ollama

Signed-off-by: Kush Gupta <[email protected]>
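
A minimal sketch of the kind of 404 handling those commits describe, assuming the pull goes through urllib; the function name and the wording of the message are illustrative, not the project's actual code.

import urllib.error
import urllib.request


def pull_ollama_model(url, dest_path):
    try:
        urllib.request.urlretrieve(url, dest_path)
    except urllib.error.HTTPError as e:
        if e.code == 404:
            # Turn a bare "HTTP Error 404: Not Found" into something actionable.
            raise RuntimeError(
                f"{url} was not found (HTTP 404). "
                "Check that the Ollama model name and tag exist."
            ) from e
        raise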
@dougsland force-pushed the timeout-handle branch 2 times, most recently from 9a9dab5 to 2f44162, on February 2, 2025 at 15:47
@ericcurtin (Collaborator):
> The emojis are fun, but we want to make sure they work on Windows, Linux, and Mac terminals.

> Already tested on macOS and Linux; it works great. Windows will be tested soon.

I think we should just copy the systemd technique; we won't be able to test every terminal, platform, combination, etc. systemd is deployed at crazy scale, so they probably have it done as correctly as it can reasonably be.

@ericcurtin (Collaborator):
If we create a new file, we must add it to install.sh; we have already broken installs because of this with gpu_detector.

- If a timeout happens, we try 5 times before sending a Timeout error to users.

- Improve the user experience when a timeout occurs.

- Added a console source tree for handling messages.

Resolves: containers#693
Signed-off-by: Douglas Schilling Landgraf <[email protected]>
@dougsland (Collaborator Author) commented Feb 2, 2025

macOS:

export RAMALAMA_FORCE_EMOJI=false
ramalama run deepseek-r1:8b
 38% |█████████████████████████████████████████████████████████████                                                                                                      |    1.79 GB/   4.58 GB   6.33 MB/s   12m 14s2025-02-02 12:30:22,832 - WARNING - [WARNING] TimeoutError: The server took too long to respond. Retrying 1/5...
2025-02-02 12:30:23,036 - WARNING - [WARNING] I/O Error: Network error: [Errno 8] nodename nor servname provided, or not known. Retrying 2/5...
2025-02-02 12:30:23,439 - WARNING - [WARNING] I/O Error: Network error: [Errno 8] nodename nor servname provided, or not known. Retrying 3/5...
2025-02-02 12:30:24,245 - WARNING - [WARNING] I/O Error: Network error: [Errno 8] nodename nor servname provided, or not known. Retrying 4/5...
2025-02-02 12:30:25,848 - WARNING - [WARNING] I/O Error: Network error: [Errno 8] nodename nor servname provided, or not known. Retrying 5/5...
2025-02-02 12:30:25,848 - ERROR - [ERROR] 
Download failed after multiple attempts.
Possible causes:
- Internet connection issue
- Server is down or unresponsive
- Firewall or proxy blocking the request
$ export RAMALAMA_FORCE_EMOJI=""
ramalama run deepseek-r1:8b   
 39% |███████████████████████████████████████████████████████████████                                                                                                    |    1.81 GB/   4.58 GB   9.28 MB/s    8m 23s2025-02-02 12:32:19,210 - WARNING - ⚠️  TimeoutError: The server took too long to respond. Retrying 1/5...
2025-02-02 12:32:19,417 - WARNING - ⚠️  I/O Error: Network error: [Errno 8] nodename nor servname provided, or not known. Retrying 2/5...
2025-02-02 12:32:19,825 - WARNING - ⚠️  I/O Error: Network error: [Errno 8] nodename nor servname provided, or not known. Retrying 3/5...
2025-02-02 12:32:20,633 - WARNING - ⚠️  I/O Error: Network error: [Errno 8] nodename nor servname provided, or not known. Retrying 4/5...
2025-02-02 12:32:22,240 - WARNING - ⚠️  I/O Error: Network error: [Errno 8] nodename nor servname provided, or not known. Retrying 5/5...
2025-02-02 12:32:22,240 - ERROR - ❌ 
Download failed after multiple attempts.
Possible causes:
- Internet connection issue
- Server is down or unresponsive
- Firewall or proxy blocking the request
 $ export RAMALAMA_FORCE_EMOJI="true"
 $ ramalama run deepseek-r1:8b       
 39% |███████████████████████████████████████████████████████████████                                                                                                    |    1.82 GB/   4.58 GB   3.25 MB/s   24m 00s2025-02-02 12:33:41,977 - WARNING - ⚠️  TimeoutError: The server took too long to respond. Retrying 1/5...
2025-02-02 12:33:42,186 - WARNING - ⚠️  I/O Error: Network error: [Errno 8] nodename nor servname provided, or not known. Retrying 2/5...
2025-02-02 12:33:42,592 - WARNING - ⚠️  I/O Error: Network error: [Errno 8] nodename nor servname provided, or not known. Retrying 3/5...
2025-02-02 12:33:43,398 - WARNING - ⚠️  I/O Error: Network error: [Errno 8] nodename nor servname provided, or not known. Retrying 4/5...
2025-02-02 12:33:45,005 - WARNING - ⚠️  I/O Error: Network error: [Errno 8] nodename nor servname provided, or not known. Retrying 5/5...
2025-02-02 12:33:45,005 - ERROR - ❌ 
Download failed after multiple attempts.
Possible causes:
- Internet connection issue
- Server is down or unresponsive
- Firewall or proxy blocking the request

@dougsland (Collaborator Author):
Same behaviour on Fedora 41.

@dougsland (Collaborator Author):
@ericcurtin ready for review; I just didn't follow why I need to change install.sh. make install seems to work just fine.

@ericcurtin (Collaborator) left a comment

We need some changes here; for one, we cannot break install.sh right now, people are using it to install right now.

@ericcurtin (Collaborator) commented Feb 2, 2025

"make install" requires a "git clone", that's not a reasonable thing to expect a user to do, a developer yes, but not a user. The install.sh also handles brew, dnf, apt, etc. People are relying on this, especially macOS users.

@ericcurtin (Collaborator) commented Feb 2, 2025

A bug was reported, and fixed, because the install.sh file was not updated:

#704

install.sh (Outdated)

- local branch="${BRANCH:-s}"
- local url="${host}/containers/ramalama/${branch}/bin/${from_file}"
+ local branch="main"
+ local url="${host}/kush-gupt/ramalama/${branch}/bin/${from_file}"
Review comment (Member):

This looks broken.

@ericcurtin (Collaborator):
This is all we need to do for emoji-support detection:

import os

def is_locale_utf8():
    # List of environment variables to check
    env_vars = ['LC_ALL', 'LC_CTYPE', 'LANG']
    for var in env_vars:
        value = os.environ.get(var, '')
        if 'UTF-8' in value:
            return True

    return False

def emoji_enabled():
    term = os.getenv("TERM")
    if term is None or term in ["dumb", "linux"]:
        return False

    return is_locale_utf8()

I didn't invent this technique or anything; I just ported it from systemd. It's more tried and tested than anything any of us would handcraft.

@ericcurtin (Collaborator):
But the FORCE or emoji_enabled() code with the cached variable looks fine.
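
As a rough sketch of what the FORCE override plus a cached emoji_enabled() can look like; the RAMALAMA_FORCE_EMOJI variable shows up in the test output above, but its exact semantics and the implementation in ramalama/console.py are assumptions here.

import os

_emoji_cache = None  # computed once, reused for every message


def is_locale_utf8():
    return any("UTF-8" in os.environ.get(var, "") for var in ("LC_ALL", "LC_CTYPE", "LANG"))


def emoji_enabled():
    global _emoji_cache
    if _emoji_cache is None:
        force = os.environ.get("RAMALAMA_FORCE_EMOJI", "")
        if force:
            # Explicit override: "true" forces emoji on, anything else forces them off.
            _emoji_cache = force.lower() == "true"
        else:
            term = os.environ.get("TERM")
            _emoji_cache = bool(term) and term not in ("dumb", "linux") and is_locale_utf8()
    return _emoji_cache


def warning_prefix():
    # Matches the two styles in the logs above: "⚠️" with emoji support, "[WARNING]" without.
    return "⚠️ " if emoji_enabled() else "[WARNING]"

Setting RAMALAMA_FORCE_EMOJI=false as in the macOS run above would produce the [WARNING] prefixes, while an empty or unset value falls back to the TERM/locale check.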


def is_locale_utf8():
    """Check if the system locale is UTF-8."""
    return 'UTF-8' in os.getenv('LC_CTYPE', '') or 'UTF-8' in os.getenv('LANG', '')
Review comment (Collaborator):

This code looks great; LC_ALL is the other one you could add if you wanted.

@dougsland (Collaborator Author):
> A bug was reported, and fixed, because the install.sh file was not updated:
>
> #704

@ericcurtin, install.sh should be improved to avoid breaking things (automatically detecting new files, or some other mechanism), and it should also be added to CI/CD. Regarding the Windows code, everything has a start; we don't need to start big. I just added it as @rhatdan requested. If he requests that Windows support be added, we will make it fly.

@dougsland (Collaborator Author):
Let me close this one; I messed up my branch sync. Let me do a fresh PR and link this one.

@dougsland dougsland closed this Feb 2, 2025
@ericcurtin (Collaborator) commented Feb 2, 2025

@dougsland, our Windows solution is not native Windows, it's WSL2. I understand that "make sure it works on Windows" is open to interpretation; I was trying to clarify that. It's documented here:

https://github.com/containers/ramalama/blob/main/docs/readme/wsl2-podman-cuda.md

Development

Successfully merging this pull request may close these issues.

(Timeout download model): python3.13/ssl.py", line 1138, in read return self._sslobj.read(len, buffer)