
odo debug port-forward uses wrong port #3840

Closed
kadel opened this issue Aug 26, 2020 · 14 comments
Labels
kind/bug: Categorizes issue or PR as related to a bug.
lifecycle/rotten: Denotes an issue or PR that has aged beyond stale and will be auto-closed.
priority/Medium: Nice to have issue. Getting it done before priority changes would be great.

Comments

@kadel
Member

kadel commented Aug 26, 2020

/kind bug

How did you run odo exactly?

Have a devfile with a debug command that defines an env variable for the debug port, like this:

  - exec:
      env:
         - name: DEBUG_PORT
           value: "5859"
      commandLine: "pip install --user debugpy && python -m debugpy --listen 0.0.0.0:${DEBUG_PORT} app.py"
      id: py-debug
      workingDir: /projects
      component: py-web
      group:
        kind: debug   

Running

▶ odo debug port-forward
Started port forwarding at ports - 5858:5858

Actual behavior

Port 5858 is used as the remote port for port forwarding (5858:5858).

Expected behavior

It should use 5959, because that is where the debugger is listening based on the DEBUG_PORT env variable.

Any logs, error output, etc?

/priority medium
/area debug

@openshift-ci-robot added the kind/bug, priority/Medium, and area/debug labels Aug 26, 2020
@prietyc123
Contributor

It should use 5959, because that is where the debugger is listening based on the DEBUG_PORT env variable.

@kadel You mean 5859 according to

- exec:
      env:
         - name: DEBUG_PORT
           value: "5859"
      commandLine: "pip install --user debugpy && python -m debugpy --listen 0.0.0.0:${DEBUG_PORT} app.py"
      id: py-debug
      workingDir: /projects
      component: py-web
      group:
        kind: debug 

right?

@sarveshtamba
Contributor

@prietyc123
1.) Is this issue seen across all platforms including x86_64?
2.) Any timelines for resolving this, since this is kind of a blocker for ppc64le progress?

@mik-dass
Contributor

@kadel Currently port-forward uses the DEBUG_PORT value from the env.yaml for devfile components. Should we remove odo env set DebugPort <value>? Or in such a case, prioritize the devfile command defined one? Also what should odo do if the DEBUG_PORT is defined in the env field for the component container in the devfile?
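
(For context: the env.yaml referenced here is odo's per-component configuration file, typically kept under the local .odo/env/ directory. The fragment below is only a sketch of where DebugPort ends up in that file, assuming the ComponentSettings layout used by odo v2; the component and project names are placeholders, and the exact field set may differ between odo versions.)

ComponentSettings:
  Name: nodejs          # placeholder component name
  Project: myproject    # placeholder project/namespace
  DebugPort: 3000       # value written by `odo env set DebugPort 3000`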

@kadel
Member Author

kadel commented Aug 27, 2020

@prietyc123
1.) Is this issue seen across all platforms including x86_64?
2.) Any timelines for resolving this, since this is kind of a blocker for ppc64le progress?

@sarveshtamba the problems described in #3502 are not related to this.

@kadel
Member Author

kadel commented Aug 27, 2020

@kadel Currently port-forward uses the DEBUG_PORT value from the env.yaml for devfile components. Should we remove odo env set DebugPort <value>? Or in such a case, prioritize the devfile command defined one? Also what should odo do if the DEBUG_PORT is defined in the env field for the component container in the devfile?

I thought that odo env set DebugPort <value> sets the local port, not the remote port in the container.

@mik-dass
Contributor

I thought that odo env set DebugPort sets the local port, not the remote port in the container.

I checked again and odo env set DebugPort <value> sets the value of the remote port.

[mrinaldas@localhost project]$ odo env set DebugPort 3000
? DebugPort is already set. Do you want to override it in the environment? Yes
Environment was successfully updated
[mrinaldas@localhost project]$ odo push --debug
Validation
 ✓  Validating the devfile [39368ns]

Creating Kubernetes resources for component nodejs
 ✓  Waiting for component to start [18s]

Applying URL changes
 ⚠  Unable to create ingress, missing host information for Endpoint http-3000, please check instructions on URL creation (refer `odo url create --help`)

 ✓  URLs are synced with the cluster, no changes are required.

Syncing to component nodejs
 ✓  Checking file changes for pushing [208004ns]
 ✓  Syncing files to the component [164ms]

Executing devfile commands for component nodejs
 ✓  Executing install command "npm install", if not running [5s]
 ✓  Executing debug command "npm run debug", if not running [1s]

Pushing devfile component nodejs
 ✓  Changes successfully pushed to component
[mrinaldas@localhost project]$ odo debug port-forward 
Started port forwarding at ports - 5858:3000

@scottkurz
Contributor

To clarify, the issue is that port-forward uses the wrong remote port when an env entry is specified on the devfile's debug command, e.g.:

  - id: debug
    exec:
      env:
         - name: DEBUG_PORT
           value: "7777"
      component: devruntime 
      commandLine: mvn -Dmaven.repo.local=/mvn/repository -Dliberty.runtime.version=20.0.0.9 -DdebugPort=${DEBUG_PORT} liberty:dev -Dliberty.env.WLP_DEBUG_REMOTE=y
      workingDir: /projects
      hotReloadCapable: true
      group:
        kind: debug
        isDefault: true

Running with the v2.0.0 GA, odo debug port-forward will use, as the remote port, either:

  • the env.yaml entry (set via odo env set DebugPort 7890 )
    OR
  • if nothing is set in the env.yaml, it will use the default port 5858.

I don't think there's any issue with the local port though.

The problem with port-forward giving precedence to the env.yaml entry is that the debug container itself gives precedence to the devfile env when substituting ${DEBUG_PORT} into the commandLine, so the forwarded remote port can differ from the port the debugger is actually listening on.
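
(To make the mismatch concrete, here is a hypothetical side-by-side of the two sources of the debug port, reusing the 7777 and 7890 values from the comment above and assuming odo's env.yaml uses a ComponentSettings block; the exact file layout may differ by version. The container resolves ${DEBUG_PORT} from the devfile command env, while port-forward reads the env.yaml value, or falls back to 5858 if it is unset, so the forward can target a port nothing is listening on.)

# devfile.yaml (fragment) -- determines where the debugger listens
commands:
  - id: debug
    exec:
      env:
        - name: DEBUG_PORT
          value: "7777"    # substituted into commandLine; debugger listens on 7777

# .odo/env/env.yaml (fragment) -- what `odo debug port-forward` uses as the remote port
ComponentSettings:
  DebugPort: 7890          # set via `odo env set DebugPort 7890`; 5858 is used if unset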

@openshift-bot

Issues go stale after 90d of inactivity.

Mark the issue as fresh by commenting /remove-lifecycle stale.
Stale issues rot after an additional 30d of inactivity and eventually close.
Exclude this issue from closing by commenting /lifecycle frozen.

If this issue is safe to close now please do so with /close.

/lifecycle stale

@openshift-ci-robot added the lifecycle/stale label Dec 24, 2020
@dharmit
Member

dharmit commented Dec 24, 2020

@mik-dass @kadel is this still important for us to address?

@sarveshtamba
Contributor

/remove-lifecycle stale

@openshift-ci-robot removed the lifecycle/stale label Dec 24, 2020
@openshift-bot

Issues go stale after 90d of inactivity.

Mark the issue as fresh by commenting /remove-lifecycle stale.
Stale issues rot after an additional 30d of inactivity and eventually close.
Exclude this issue from closing by commenting /lifecycle frozen.

If this issue is safe to close now please do so with /close.

/lifecycle stale

@openshift-ci-robot added the lifecycle/stale label Mar 24, 2021
@openshift-bot

Stale issues rot after 30d of inactivity.

Mark the issue as fresh by commenting /remove-lifecycle rotten.
Rotten issues close after an additional 30d of inactivity.
Exclude this issue from closing by commenting /lifecycle frozen.

If this issue is safe to close now please do so with /close.

/lifecycle rotten
/remove-lifecycle stale

@openshift-ci-robot added the lifecycle/rotten label and removed the lifecycle/stale label Apr 23, 2021
@openshift-bot

Rotten issues close after 30d of inactivity.

Reopen the issue by commenting /reopen.
Mark the issue as fresh by commenting /remove-lifecycle rotten.
Exclude this issue from closing again by commenting /lifecycle frozen.

/close

@openshift-ci

openshift-ci bot commented May 24, 2021

@openshift-bot: Closing this issue.

In response to this:

Rotten issues close after 30d of inactivity.

Reopen the issue by commenting /reopen.
Mark the issue as fresh by commenting /remove-lifecycle rotten.
Exclude this issue from closing again by commenting /lifecycle frozen.

/close

Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes/test-infra repository.

openshift-ci bot closed this as completed May 24, 2021