
mirrord with trace logs panics with node app #3008

Open
meowjesty opened this issue Jan 7, 2025 · 2 comments
Labels
bug Something isn't working

Comments

@meowjesty
Member

meowjesty commented Jan 7, 2025

Bug Description

Managed to reproduce it with a Node.js app that tries to open a file that exists neither on my local machine nor in the remote target pod. The crash doesn't happen with debug-level logs.

  • app.mjs
import fs from 'fs';

fs.open("/app/file.local", (fail, fd) => {
  console.log(`open file.local ${fd}`);
  console.error(fail);
});

I'm running it with: MIRRORD_PROGRESS_MODE=off RUST_LOG=mirrord=trace,warn mirrord exec --config-file mirrord.json node app.mjs

Steps to Reproduce

  1. Run the Node.js app with mirrord (trace-level logs);

Backtrace

thread '<unnamed>' panicked at /home/meowjesty/.rustup/toolchains/nightly-2024-10-11-x86_64-unknown-linux-gnu/lib/rustlib/src/rust/library/std/src/thread/local.rs:260:26:
cannot access a Thread Local Storage value during or after destruction: AccessError
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
thread '<unnamed>' panicked at core/src/panicking.rs:221:5:
panic in a function that cannot unwind
stack backtrace:
   0:     0x7f6109087a0a - std::backtrace_rs::backtrace::libunwind::trace::h1f2bdbcddcdf7b49
                               at /rustc/52fd9983996d9fcfb719749838336be66dee68f9/library/std/src/../../backtrace/src/backtrace/libunwind.rs:116:5
   1:     0x7f6109087a0a - std::backtrace_rs::backtrace::trace_unsynchronized::h3f655e61c08a7c9d
                               at /rustc/52fd9983996d9fcfb719749838336be66dee68f9/library/std/src/../../backtrace/src/backtrace/mod.rs:66:5
   2:     0x7f6109087a0a - std::sys::backtrace::_print_fmt::hf765b11a3e4e09be
                               at /rustc/52fd9983996d9fcfb719749838336be66dee68f9/library/std/src/sys/backtrace.rs:66:9
   3:     0x7f6109087a0a - <std::sys::backtrace::BacktraceLock::print::DisplayBacktrace as core::fmt::Display>::fmt::h87acf85b5211808c
                               at /rustc/52fd9983996d9fcfb719749838336be66dee68f9/library/std/src/sys/backtrace.rs:39:26
   4:     0x7f61090afd93 - core::fmt::rt::Argument::fmt::hcc0469ac70529a47
                               at /rustc/52fd9983996d9fcfb719749838336be66dee68f9/library/core/src/fmt/rt.rs:177:76
   5:     0x7f61090afd93 - core::fmt::write::h1830300cff8013e6
                               at /rustc/52fd9983996d9fcfb719749838336be66dee68f9/library/core/src/fmt/mod.rs:1186:21
   6:     0x7f6109083ff3 - std::io::Write::write_fmt::hff4c51c59db7db45
                               at /rustc/52fd9983996d9fcfb719749838336be66dee68f9/library/std/src/io/mod.rs:1839:15
   7:     0x7f6109087852 - std::sys::backtrace::BacktraceLock::print::h5e8429acc66fa0b4
                               at /rustc/52fd9983996d9fcfb719749838336be66dee68f9/library/std/src/sys/backtrace.rs:42:9
   8:     0x7f61090887fc - std::panicking::default_hook::{{closure}}::h47c5621b8a2933e4
                               at /rustc/52fd9983996d9fcfb719749838336be66dee68f9/library/std/src/panicking.rs:268:22
   9:     0x7f6109088642 - std::panicking::default_hook::h4c51255dc678cdff
                               at /rustc/52fd9983996d9fcfb719749838336be66dee68f9/library/std/src/panicking.rs:295:9
  10:     0x7f6109088e37 - std::panicking::rust_panic_with_hook::hfa48cfa9339c55df
                               at /rustc/52fd9983996d9fcfb719749838336be66dee68f9/library/std/src/panicking.rs:801:13
  11:     0x7f6109088c96 - std::panicking::begin_panic_handler::{{closure}}::hf82473ea8efddcc4
                               at /rustc/52fd9983996d9fcfb719749838336be66dee68f9/library/std/src/panicking.rs:667:13
  12:     0x7f6109087ee9 - std::sys::backtrace::__rust_end_short_backtrace::ha3aa6b6ac85645ad
                               at /rustc/52fd9983996d9fcfb719749838336be66dee68f9/library/std/src/sys/backtrace.rs:170:18
  13:     0x7f610908895c - rust_begin_unwind
                               at /rustc/52fd9983996d9fcfb719749838336be66dee68f9/library/std/src/panicking.rs:665:5
  14:     0x7f61090ad70d - core::panicking::panic_nounwind_fmt::runtime::h7df6b5d94f5a31c1
                               at /rustc/52fd9983996d9fcfb719749838336be66dee68f9/library/core/src/panicking.rs:112:18
  15:     0x7f61090ad70d - core::panicking::panic_nounwind_fmt::h96fc3e22c4306096
                               at /rustc/52fd9983996d9fcfb719749838336be66dee68f9/library/core/src/panicking.rs:122:5
  16:     0x7f61090ad7a2 - core::panicking::panic_nounwind::h79efc6f334f6baed
                               at /rustc/52fd9983996d9fcfb719749838336be66dee68f9/library/core/src/panicking.rs:221:5
  17:     0x7f61090ad966 - core::panicking::panic_cannot_unwind::hff8fa5fb0642a893
                               at /rustc/52fd9983996d9fcfb719749838336be66dee68f9/library/core/src/panicking.rs:310:5
  18:     0x7f610840020f - mirrord_layer::socket::hooks::fcntl_detour::h8c19cd82d94cf29a
                               at /home/meowjesty/dev/metalbear/mirrord/mirrord/layer/src/socket/hooks.rs:194:1
  19:     0x7f6107871d24 - <unknown>
thread caused non-unwinding panic. aborting.
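The first panic message comes from `std::thread::LocalKey::with`, which panics with `AccessError` when a thread-local is accessed during or after its destruction (e.g. while the thread is tearing down its TLS values). A minimal sketch of the safe alternative, `try_with`, using a hypothetical `COUNTER` thread-local as a stand-in for whatever per-thread state the tracing machinery keeps:

```rust
use std::cell::Cell;
use std::thread::AccessError;

thread_local! {
    // Stand-in for per-thread state; not mirrord's actual internals.
    static COUNTER: Cell<u32> = Cell::new(0);
}

// `LocalKey::with` panics with `AccessError` if the value is accessed
// during or after its destruction; `try_with` returns `Err(AccessError)`
// instead, so the caller can skip the work during thread teardown.
fn bump() -> Result<u32, AccessError> {
    COUNTER.try_with(|c| {
        c.set(c.get() + 1);
        c.get()
    })
}

fn main() {
    // On a live thread both accesses succeed.
    assert_eq!(bump().unwrap(), 1);
    assert_eq!(bump().unwrap(), 2);
    println!("ok");
}
```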

mirrord layer logs

2025-01-07T21:42:17.927536Z TRACE ThreadId(02) open_logic:open: mirrord_layer::file::ops: new raw_path=0x7ffca29f7690 open_flags=524546 _mode=0 path=Success("/dev/pts/4") open_options=OpenOptionsInternal { read: true, write: true, append: false, truncate: false, create: false, create_new: false }
2025-01-07T21:42:17.927549Z TRACE ThreadId(02) open_logic:open: mirrord_layer::file::ops: return=Bypass(IgnoredFile("/dev/pts/4")) raw_path=0x7ffca29f7690 open_flags=524546 _mode=0 path=Success("/dev/pts/4") open_options=OpenOptionsInternal { read: true, write: true, append: false, truncate: false, create: false, create_new: false }
2025-01-07T21:42:17.927560Z TRACE ThreadId(02) open_logic:open: mirrord_layer::file::ops: close time.busy=14.1µs time.idle=9.76µs raw_path=0x7ffca29f7690 open_flags=524546 _mode=0 path=Success("/dev/pts/4") open_options=OpenOptionsInternal { read: true, write: true, append: false, truncate: false, create: false, create_new: false }
2025-01-07T21:42:17.927571Z TRACE ThreadId(02) open_logic: mirrord_layer::file::hooks: return=Bypass(IgnoredFile("/dev/pts/4")) raw_path=0x7ffca29f7690 open_flags=524546 _mode=0
2025-01-07T21:42:17.927580Z TRACE ThreadId(02) open_logic: mirrord_layer::file::hooks: close time.busy=58.6µs time.idle=7.87µs raw_path=0x7ffca29f7690 open_flags=524546 _mode=0
2025-01-07T21:42:17.927597Z TRACE ThreadId(02) dup: mirrord_layer::socket::ops: new fd=21 dup_fd=2
2025-01-07T21:42:17.927605Z TRACE ThreadId(02) dup: mirrord_layer::socket::ops: return=Ok(()) fd=21 dup_fd=2
2025-01-07T21:42:17.927614Z TRACE ThreadId(02) dup: mirrord_layer::socket::ops: close time.busy=9.29µs time.idle=7.85µs fd=21 dup_fd=2
open file.local undefined
[Error: ENOENT: no such file or directory, open '/app/file.local'] {
  errno: -2,
  code: 'ENOENT',
  syscall: 'open',
  path: '/app/file.local'
}
2025-01-07T21:42:17.929902Z TRACE ThreadId(02) fcntl: mirrord_layer::socket::ops: new orig_fd=0 cmd=3 fcntl_fd=2
2025-01-07T21:42:17.929927Z TRACE ThreadId(02) fcntl: mirrord_layer::socket::ops: close time.busy=3.01µs time.idle=23.1µs orig_fd=0 cmd=3 fcntl_fd=2
2025-01-07T21:42:17.929951Z TRACE ThreadId(02) fcntl: mirrord_layer::socket::ops: new orig_fd=1 cmd=3 fcntl_fd=32770
2025-01-07T21:42:17.929961Z TRACE ThreadId(02) fcntl: mirrord_layer::socket::ops: close time.busy=1.16µs time.idle=9.50µs orig_fd=1 cmd=3 fcntl_fd=32770
2025-01-07T21:42:17.929975Z TRACE ThreadId(02) fcntl: mirrord_layer::socket::ops: new orig_fd=2 cmd=3 fcntl_fd=32770
2025-01-07T21:42:17.929984Z TRACE ThreadId(02) fcntl: mirrord_layer::socket::ops: close time.busy=871ns time.idle=7.94µs orig_fd=2 cmd=3 fcntl_fd=32770

mirrord intproxy logs

2025-01-07T21:42:17.863696Z TRACE mirrord_intproxy::proxies::files: new message in message_bus, LayerForked(LayerForked { child: LayerId(1), parent: LayerId(0) })
    at mirrord/intproxy/src/proxies/files.rs:779

  2025-01-07T21:42:17.865258Z TRACE mirrord_intproxy::layer_conn: message bus closed, exiting
    at mirrord/intproxy/src/layer_conn.rs:63

  2025-01-07T21:42:17.865378Z TRACE mirrord_intproxy: layer connection 1 closed
    at mirrord/intproxy/src/lib.rs:236

  2025-01-07T21:42:17.865435Z TRACE mirrord_intproxy::proxies::files: new message in message_bus, LayerClosed(LayerClosed { id: LayerId(1) })
    at mirrord/intproxy/src/proxies/files.rs:779

  2025-01-07T21:42:17.902300Z  INFO mirrord_intproxy::layer_initializer: new session, process_info: ProcessInfo { pid: 120114, name: "node", cmdline: ["node", "app.mjs"], loaded: true }
    at mirrord/intproxy/src/layer_initializer.rs:63
    in mirrord_intproxy::layer_initializer::handle_new_stream with self: LayerInitializer { listener: PollEvented { io: Some(TcpListener { addr: 127.0.0.1:57571, fd: 11 }) }, next_layer_id: LayerId(2) }, stream: PollEvented { io: Some(TcpStream { addr: 127.0.0.1:57571, peer: 127.0.0.1:54558, fd: 13 }) }

  2025-01-07T21:42:17.902359Z TRACE mirrord_intproxy::layer_initializer: return: Ok(NewLayer { stream: PollEvented { io: Some(TcpStream { addr: 127.0.0.1:57571, peer: 127.0.0.1:54558, fd: 13 }) }, id: LayerId(2), parent_id: None })
    at mirrord/intproxy/src/layer_initializer.rs:46
    in mirrord_intproxy::layer_initializer::handle_new_stream with self: LayerInitializer { listener: PollEvented { io: Some(TcpListener { addr: 127.0.0.1:57571, fd: 11 }) }, next_layer_id: LayerId(2) }, stream: PollEvented { io: Some(TcpStream { addr: 127.0.0.1:57571, peer: 127.0.0.1:54558, fd: 13 }) }

  2025-01-07T21:42:17.923190Z TRACE mirrord_intproxy: return: Ok(())
    at mirrord/intproxy/src/lib.rs:350
    in mirrord_intproxy::handle_layer_message with FromLayer { message_id: 1, layer_id: LayerId(2), message: File(Open(OpenFileRequest { path: "/app/file.local", open_options: OpenOptionsInternal { read: true, write: false, append: false, truncate: false, create: false, create_new: false } })) }

  2025-01-07T21:42:17.923274Z TRACE mirrord_intproxy::proxies::files: new message in message_bus, FileReq(1, LayerId(2), Open(OpenFileRequest { path: "/app/file.local", open_options: OpenOptionsInternal { read: true, write: false, append: false, truncate: false, create: false, create_new: false } }))
    at mirrord/intproxy/src/proxies/files.rs:779

  2025-01-07T21:42:17.923402Z TRACE mirrord_intproxy::agent_conn: return: Ok(())
    at mirrord/intproxy/src/agent_conn.rs:191
    in mirrord_intproxy::agent_conn::send_agent_message with msg: FileRequest(Open(OpenFileRequest { path: "/app/file.local", open_options: OpenOptionsInternal { read: true, write: false, append: false, truncate: false, create: false, create_new: false } }))

  2025-01-07T21:42:17.924634Z TRACE mirrord_intproxy: return: Ok(())
    at mirrord/intproxy/src/lib.rs:273
    in mirrord_intproxy::handle_agent_message with File(Open(Err(RemoteIO(RemoteIOError { raw_os_error: Some(2), kind: NotFound }))))

  2025-01-07T21:42:17.924691Z TRACE mirrord_intproxy::proxies::files: new message in message_bus, FileRes(Open(Err(RemoteIO(RemoteIOError { raw_os_error: Some(2), kind: NotFound }))))
    at mirrord/intproxy/src/proxies/files.rs:779

  2025-01-07T21:42:17.924886Z TRACE mirrord_intproxy::layer_conn: return: Ok(())
    at mirrord/intproxy/src/layer_conn.rs:39
    in mirrord_intproxy::layer_conn::send_layer_message with msg: LocalMessage { message_id: 1, inner: File(Open(Err(RemoteIO(RemoteIOError { raw_os_error: Some(2), kind: NotFound })))) }, layer_id: 2

  2025-01-07T21:42:18.240640Z TRACE mirrord_intproxy::layer_conn: message bus closed, exiting
    at mirrord/intproxy/src/layer_conn.rs:63

  2025-01-07T21:42:18.240885Z TRACE mirrord_intproxy: layer connection 2 closed
    at mirrord/intproxy/src/lib.rs:236

  2025-01-07T21:42:18.240989Z TRACE mirrord_intproxy::proxies::files: new message in message_bus, LayerClosed(LayerClosed { id: LayerId(2) })
    at mirrord/intproxy/src/proxies/files.rs:779

  2025-01-07T21:42:18.241911Z TRACE mirrord_intproxy::layer_conn: message bus closed, exiting
    at mirrord/intproxy/src/layer_conn.rs:63

  2025-01-07T21:42:18.242025Z TRACE mirrord_intproxy: layer connection 0 closed
    at mirrord/intproxy/src/lib.rs:236

  2025-01-07T21:42:18.242077Z TRACE mirrord_intproxy::proxies::files: new message in message_bus, LayerClosed(LayerClosed { id: LayerId(0) })
    at mirrord/intproxy/src/proxies/files.rs:779

  2025-01-07T21:42:23.243458Z TRACE mirrord_intproxy: intproxy timeout, no active connections. Exiting.
    at mirrord/intproxy/src/lib.rs:158

  2025-01-07T21:42:23.243710Z TRACE mirrord_intproxy::layer_initializer: message bus closed, exiting
    at mirrord/intproxy/src/layer_initializer.rs:99

  2025-01-07T21:42:23.243890Z TRACE mirrord_intproxy::agent_conn: message bus closed, exiting
    at mirrord/intproxy/src/agent_conn.rs:213

  2025-01-07T21:42:23.243983Z TRACE mirrord_intproxy::proxies::simple: message bus closed, exiting
    at mirrord/intproxy/src/proxies/simple.rs:94

  2025-01-07T21:42:23.244056Z TRACE mirrord_intproxy::proxies::outgoing: message bus closed, exiting
    at mirrord/intproxy/src/proxies/outgoing.rs:242

  2025-01-07T21:42:23.244184Z TRACE mirrord_intproxy::proxies::incoming: message bus closed, exiting
    at mirrord/intproxy/src/proxies/incoming.rs:502
    in mirrord_intproxy::proxies::incoming::run

  2025-01-07T21:42:23.244349Z TRACE mirrord_intproxy::ping_pong: message bus closed, exiting
    at mirrord/intproxy/src/ping_pong.rs:85

  2025-01-07T21:42:23.244420Z TRACE mirrord_intproxy::proxies::files: message bus closed, exiting
    at mirrord/intproxy/src/proxies/files.rs:797

  2025-01-07T21:42:23.244492Z TRACE mirrord_kube::api: No more client messages, disconnecting
    at mirrord/kube/src/api.rs:38
    in mirrord_kube::api::wrap_raw_connection
    in mirrord::internal_proxy::connect_and_ping with config: LayerConfig { accept_invalid_certificates: None, skip_processes: None, skip_build_tools: true, connect_tcp: None, operator: Some(false), kubeconfig: None, sip_binaries: None, target: TargetConfig { path: Some(Deployment(DeploymentTarget { deployment: "sample-python-deployment", container: None })), namespace: None }, agent: AgentConfig { log_level: "mirrord=debug", json_log: false, namespace: None, image: AgentImageConfig("meowjesty/mirrord-agent:latest"), image_pull_policy: "Always", image_pull_secrets: None, ttl: 90, ephemeral: false, communication_timeout: None, startup_timeout: 60, network_interface: None, flush_connections: true, disabled_capabilities: None, tolerations: None, resources: None, check_out_of_pods: true, privileged: false, nftables: false, dns: AgentDnsConfig { timeout: None, attempts: None }, labels: None, annotations: None, node_selector: None, service_account: None, test_error: false }, container: ContainerConfig { cli_image: "ghcr.io/metalbear-co/mirrord-cli:3.128.0", cli_extra_args: [], cli_prevent_cleanup: false, cli_image_lib_path: "/opt/mirrord/lib/libmirrord_layer.so" }, feature: FeatureConfig { env: EnvConfig { include: None, exclude: None, override: None, load_from_process: None, unset: None, env_file: None, mapping: None }, fs: FsConfig { mode: Write, read_write: None, read_only: None, local: None, not_found: None, mapping: None }, network: NetworkConfig { incoming: IncomingConfig { port_mapping: {}, ignore_localhost: false, ignore_ports: {}, mode: Steal, http_filter: HttpFilterConfig { header_filter: None, path_filter: None, all_of: None, any_of: None, ports: PortList(Multiple([80, 8080])) }, listen_ports: {}, on_concurrent_steal: Abort, ports: None }, outgoing: OutgoingConfig { tcp: true, udp: true, ignore_localhost: false, filter: None, unix_streams: None }, dns: DnsConfig { enabled: true, filter: None } }, copy_target: CopyTargetConfig { enabled: false, scale_down: 
false }, hostname: true, split_queues: SplitQueuesConfig({}) }, telemetry: true, kube_context: None, internal_proxy: InternalProxyConfig { connect_tcp: None, client_tls_certificate: None, client_tls_key: None, start_idle_timeout: 60, idle_timeout: 5, socket_timeout: 31536000, log_level: None, log_destination: None, container_mode: false }, external_proxy: ExternalProxyConfig { tls_enable: true, tls_certificate: None, tls_key: None, start_idle_timeout: 60, idle_timeout: 5, log_level: None, log_destination: None }, use_proxy: true, experimental: ExperimentalConfig { tcp_ping4_mock: true, readlink: false, trust_any_certificate: false, enable_exec_hooks_linux: true, hide_ipv6_interfaces: false, disable_reuseaddr: false, use_dev_null: true, readonly_file_buffer: 128000 } }, connect_info: Some(DirectKubernetes(AgentKubernetesConnectInfo { pod_name: "mirrord-agent-dkz630jfwt-hpj6v", agent_port: 43696, namespace: None, agent_version: Some("3.128.0") })), analytics: AnalyticsReporter { enabled: true, error_only_send: false, analytics: Analytics { data: {"readlink": Bool(false), "disable_reuseaddr": Bool(false), "split_queues": Nested(Analytics { data: {"sqs_queue_count": Number(0), "kafka_queue_count": Number(0)} }), "network": Nested(Analytics { data: {"incoming": Nested(Analytics { data: {"http": Nested(Analytics { data: {"ports": Number(2), "path_filter": Bool(false), "header_filter": Bool(false)} }), "mode": Number(1), "port_mapping_count": Number(0), "listen_ports_count": Number(0), "concurrent_steal": Number(2), "ignore_localhost": Bool(false), "ignore_ports_count": Number(0)} }), "outgoing": Nested(Analytics { data: {"ignore_localhost": Bool(false), "unix_streams": Number(0), "tcp": Bool(true), "udp": Bool(true)} }), "dns": Nested(Analytics { data: {"enabled": Bool(true)} })} }), "tcp_ping4_mock": Bool(true), "use_kubeconfig": Bool(false), "fs": Nested(Analytics { data: {"not_found_paths": Number(0), "mode": Number(3), "read_only_paths": Number(0), "local_paths": 
Number(0), "read_write_paths": Number(0)} }), "env": Nested(Analytics { data: {"env_mapping_count": Number(0), "unset_count": Number(0), "exclude_count": Number(0), "env_file_used": Bool(false), "overrides_count": Number(0), "include_count": Number(0)} }), "copy_target": Nested(Analytics { data: {"enabled": Bool(false), "scale_down": Bool(false)} }), "hostname": Bool(true), "readonly_file_buffer": Number(128000), "ephemeral": Bool(false), "hide_ipv6_interfaces": Bool(false), "execution_kind": Number(2), "trust_any_certificate": Bool(false), "target_mode": Number(4), "enable_exec_hooks_linux": Bool(true)} }, error: None, start_instant: Instant { tv_sec: 27360, tv_nsec: 776019371 }, operator_properties: None, watch: Watch { .. } }

  2025-01-07T21:42:23.250333Z TRACE mirrord_intproxy: AGENT_CONNECTION result: Ok(())
    at mirrord/intproxy/src/lib.rs:169

  2025-01-07T21:42:23.250346Z TRACE mirrord_intproxy: SIMPLE_PROXY result: Ok(())
    at mirrord/intproxy/src/lib.rs:169

  2025-01-07T21:42:23.250348Z TRACE mirrord_intproxy: OUTGOING_PROXY result: Ok(())
    at mirrord/intproxy/src/lib.rs:169

  2025-01-07T21:42:23.250351Z TRACE mirrord_intproxy: PING_PONG result: Ok(())
    at mirrord/intproxy/src/lib.rs:169

  2025-01-07T21:42:23.250353Z TRACE mirrord_intproxy: INCOMING_PROXY result: Ok(())
    at mirrord/intproxy/src/lib.rs:169

  2025-01-07T21:42:23.250355Z TRACE mirrord_intproxy: FILES_PROXY result: Ok(())
    at mirrord/intproxy/src/lib.rs:169

  2025-01-07T21:42:23.250357Z TRACE mirrord_intproxy: LAYER_INITIALIZER result: Ok(())
    at mirrord/intproxy/src/lib.rs:169

  2025-01-07T21:42:23.256606Z  WARN mirrord_kube::api::kubernetes::portforwarder: finished retry strategy, closing connection, connect_info: AgentKubernetesConnectInfo { pod_name: "mirrord-agent-dkz630jfwt-hpj6v", agent_port: 43696, namespace: None, agent_version: Some("3.128.0") }
    at mirrord/kube/src/api/kubernetes/portforwarder.rs:168

mirrord agent logs

No response

mirrord config

{
  "target": "deployment/sample-python-deployment",
  "agent": {
    "log_level": "mirrord=debug",
    "image": "meowjesty/mirrord-agent:latest",
    "image_pull_policy": "Always",
    "startup_timeout": 60,
    "ttl": 90
  },
  "operator": false,
  "feature": {
    "network": {
      "incoming": "steal",
      "dns": true
    },
    "fs": "write"
  }
}

mirrord CLI version

No response

mirrord-agent version

No response

mirrord-operator version (if relevant)

No response

plugin kind and version (if relevant)

No response

Your operating system and version

linux

Local process

app.mjs

Local process version

No response

Additional Info

No response

@meowjesty meowjesty added the bug Something isn't working label Jan 7, 2025

linear bot commented Jan 7, 2025

@t4lz
Member

t4lz commented Jan 26, 2025

Got a similar crash with:

RUST_LOG=mirrord=trace mirrord-dev exec node tests/node-e2e/outgoing/test_outgoing_traffic_single_request.mjs

(where mirrord-dev is the locally-built mirrord binary)

crash output

thread '<unnamed>' panicked at /Users/tal/.rustup/toolchains/nightly-2024-10-11-aarch64-apple-darwin/lib/rustlib/src/rust/library/std/src/thread/local.rs:260:26:
cannot access a Thread Local Storage value during or after destruction: AccessError
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
thread '<unnamed>' panicked at core/src/panicking.rs:221:5:
panic in a function that cannot unwind
stack backtrace:
[1]    60557 illegal hardware instruction  RUST_LOG=mirrord=trace  exec node 

And with backtrace:

RUST_BACKTRACE=full RUST_LOG=mirrord=trace mirrord-dev exec node tests/node-e2e/outgoing/test_outgoing_traffic_single_request.mjs

crash output

thread '<unnamed>' panicked at /Users/tal/.rustup/toolchains/nightly-2024-10-11-aarch64-apple-darwin/lib/rustlib/src/rust/library/std/src/thread/local.rs:260:26:
cannot access a Thread Local Storage value during or after destruction: AccessError
stack backtrace:
[1]    86419 illegal hardware instruction  RUST_BACKTRACE=full RUST_LOG=mirrord=trace  exec node 

It does not happen with released binaries: for example, it does not happen with the released 3.130.0 binary. But when I checked out that tag and built a debug build, the crash did happen with that binary.

Since it only happens with trace-level logs and only in debug builds, I would say it's probably related to mirrord_layer_macro::instrument.
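This would be consistent with the two panic messages in the backtraces: instrumented tracing inside a hook touches per-thread state after it has been destroyed, and because the hook is an `extern "C"` function, the resulting panic cannot unwind and aborts the process. A hedged sketch of that failure mode (the detour name and the `SPAN_BUF` thread-local are hypothetical, not mirrord's actual code):

```rust
use std::cell::RefCell;

thread_local! {
    // Hypothetical stand-in for a per-thread buffer that a tracing
    // subscriber might keep; not mirrord's actual internals.
    static SPAN_BUF: RefCell<Vec<&'static str>> = RefCell::new(Vec::new());
}

// An `extern "C"` function is a non-unwinding boundary: if code inside it
// panics (as `LocalKey::with` does once the thread-local is destroyed),
// the runtime reports "panic in a function that cannot unwind" and aborts.
// Guarding the access with `try_with` avoids the panic entirely.
extern "C" fn detour_sketch(fd: i32) -> i32 {
    let _ = SPAN_BUF.try_with(|b| b.borrow_mut().push("detour_sketch"));
    // ... a real detour would forward to the original libc function here ...
    fd
}

fn main() {
    assert_eq!(detour_sketch(3), 3);
    println!("ok");
}
```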
