How to extract URLs from network traffic on Linux?

Hello,

The web browser has no developer console, and the web page refuses to work if one is enabled. The page source does not show any streamed video URL, so I was thinking I could run some Linux command to capture network traffic for, say, 5 minutes during video playback and extract all HTTP/HTTPS URLs (will I even be able to read the URLs if the traffic is HTTPS?). Does anyone know which command to use?

I can probably put together a sed command to extract the URLs, but I do not know which network capture command to use. I was not successful with the Wireshark GUI.
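For plain HTTP, something like this is what I have in mind (`eth0` and the 5-minute window are just placeholders; HTTPS payloads are encrypted, so this will not reveal HTTPS URLs):

```shell
# Capture 5 minutes of port-80 traffic (adjust the interface to yours):
sudo timeout 300 tcpdump -i eth0 -s 0 -w capture.pcap 'tcp port 80'

# Pull URL-shaped strings out of the plaintext payloads.
# Note: full URLs mostly appear in Referer headers; otherwise the Host
# header has to be joined with the request path by hand.
strings capture.pcap | grep -oE 'https?://[^[:space:]"]+' | sort -u
```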

Look into ngrep (network grep): GitHub - jpr5/ngrep. From its description: "ngrep is like GNU grep applied to the network layer. It's a PCAP-based tool that allows you to specify an extended regular or hexadecimal expression to match against data payloads of packets. It understands many kinds of protocols, including IPv4/6, TCP, UDP, ICMPv4/6, IGMP and Raw, across a wide variety of interface types, and understands BPF filter logic in the same fashion as more common packet sniffing tools, such as tcpdump and snoop."
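A minimal invocation, assuming the traffic is plain HTTP on port 80 (for HTTPS the payload will just be ciphertext):

```shell
# Print HTTP request lines seen on any interface; -q suppresses the
# '#' progress markers and -W byline keeps the payload readable.
sudo ngrep -d any -q -W byline '^(GET|POST) ' 'tcp port 80'
```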


Thanks for mentioning "ngrep"; I have tried it, but I think it reports only plaintext traffic, not HTTPS content. So I cannot "intercept" HTTPS traffic going in/out of a web browser unless I use some special tool that can see what the browser is doing internally (can some Linux system tool do this?).
Update: I have discovered that the video is DRM-protected, so I would not be able to download the stream anyway.

Ah, yes, HTTPS is always hard because of all the security around it. However, you can use a MITM proxy:

https://mitmproxy.org/
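A rough sketch of how that could look (assuming mitmproxy is installed, the browser is configured to use the proxy, and the mitmproxy CA certificate is trusted in the browser — without that last step HTTPS sites will show certificate errors):

```shell
# Run the proxy on port 8080 and keep only URL-shaped strings from the
# request lines mitmdump prints as the browser talks through it:
mitmdump --listen-port 8080 | grep -oE 'https?://[^[:space:]]+'
```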

I have found that httpry can help in this scenario.
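For example (`eth0` is a placeholder; like the other sniffers it only sees plaintext HTTP, and since httpry logs host and request path as separate fields in some versions, the grep below only catches absolute URLs — you may have to join the fields yourself):

```shell
# Log HTTP request summaries to a file, then grab anything URL-shaped:
sudo httpry -i eth0 -o httpry.log
grep -oE 'https?://[^[:space:]]+' httpry.log | sort -u
```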

Nice. Do you think it will work on FreeBSD? It has not been updated since 2014.
