Unable to see codecs used in a VoIP application using Wireshark on macOS

Is there any way to see the codecs (G729, AMR, …) used by a VoIP application in Wireshark?
I want to analyse a VoIP application, but I can only see SIP methods; I didn't find a way to see which codecs are used. I also tried to look at RTP packets, but I couldn't find any (I filtered on rtp). I actually made a call from the VoIP application and was only able to see the SIP protocol; I couldn't find RTP for the same call. Does anyone have any idea how to analyse SRTP in Wireshark? I am using a Mac. Can anyone help?

If the channel is encrypted with SRTP, you cannot analyse the packets to determine their format.
If you want to know which codec is in use, it is better to capture the SIP messages from the beginning of the call, since the negotiated codecs are listed in the SDP offer/answer; but if the SIP signalling goes over an SSL/TLS channel, it cannot be read either.
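For reference, when the signalling is readable, the codecs show up in the SDP body carried by the SIP INVITE and its 200 OK. A simplified, purely illustrative audio offer (addresses and ports are made up) advertising G.729 and AMR would look like this:

v=0
o=alice 2890844526 2890844526 IN IP4 192.0.2.10
s=-
c=IN IP4 192.0.2.10
t=0 0
m=audio 49170 RTP/AVP 18 96
a=rtpmap:18 G729/8000
a=rtpmap:96 AMR/8000

The payload types on the m=audio line (static 18 for G.729, dynamic 96 mapped to AMR by a=rtpmap) are what identify the codec. With SRTP the media payload stays opaque, but these SDP lines still reveal the codec, as long as the SIP leg itself is not carried over TLS.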

Related

How to send an audio stream over SIP

I'm developing an application that receives an audio stream over a WebSocket and needs to forward the audio to a SIP server.
Currently, I've managed to connect to the audio source using a WebSocket and receive the media stream (u-law encoded) using Node-RED, but I'm struggling to figure out how to send the media stream to the SIP server. Any advice would be much appreciated.
I looked into this for a similar question a while back; I can't find where that was now.
As you probably know, the media part of SIP is RTP, so it's a fairly separate stack from the call signalling.
I didn't find any nodes that supported it, and the few Node.js libraries for RTP were all very incomplete and out of date.
In theory it might be possible to craft your own RTP streams using the UDP nodes and then create the relevant SDP in the SIP response, but I'm not sure how robust or scalable this would be.
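Purely as an illustration of that idea, here is a minimal sketch in Node.js/TypeScript (the function name, destination host/port, and the 20 ms u-law framing are assumptions, not anything Node-RED ships with):

// Wrap 20 ms G.711 u-law frames in a minimal RTP header and send them over UDP.
import { createSocket } from "node:dgram";

const socket = createSocket("udp4");
const ssrc = Math.floor(Math.random() * 0xffffffff); // random stream identifier
let seq = 0;
let timestamp = 0;

function sendRtpFrame(ulawFrame: Buffer, destPort: number, destHost: string): void {
  const header = Buffer.alloc(12);
  header[0] = 0x80;                          // V=2, no padding, no extension, no CSRC
  header[1] = 0x00;                          // marker=0, payload type 0 (PCMU / u-law)
  header.writeUInt16BE(seq & 0xffff, 2);     // sequence number
  header.writeUInt32BE(timestamp >>> 0, 4);  // timestamp on the 8 kHz audio clock
  header.writeUInt32BE(ssrc >>> 0, 8);       // SSRC
  socket.send(Buffer.concat([header, ulawFrame]), destPort, destHost);
  seq++;
  timestamp += 160;                          // 160 samples = 20 ms at 8 kHz
}

You would still have to advertise a matching m=audio line (payload type 0) in the SDP of the SIP answer and deal with RTCP, jitter and pacing yourself, which is where this approach gets fragile.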
The other option is that there are a couple of programmable comms platforms out there that support both SIP and WebSockets, so you could possibly utilise one of those, connecting from Node-RED via WebSocket and letting them do the SIP work.
I've done SIP<->WebSocket work with both the Vonage API (previously Nexmo) and Jambonz (open source).

SIP communication with Web socket (Web RTC)

SIP (Session Initiation Protocol) does not understand WebSocket, so we need a SIP proxy, which is basically a translator between SIP and WebSocket.
I am following this architecture for SIP handshaking over WebSocket. I have a few questions.
Which SIP proxy should be used to make audio and video calls? In the gateway-to-SIP module I am using Asterisk. How can Asterisk be used for video calls, and is there any codec available for video calls? Please share some useful links.
Your kind answers will be highly appreciated.
Check out http://jssip.net. They provide a JavaScript API that uses SIP over WebSocket on the client side, and they also have a SIP proxy and server (it also works with Asterisk and Kamailio). They are the authors of RFC 7118, "The WebSocket Protocol as a Transport for the Session Initiation Protocol (SIP)".
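A minimal client-side sketch with JsSIP, assuming a WebSocket-enabled SIP server (e.g. Asterisk or Kamailio) is reachable at the placeholder wss:// URL; the URIs and password are made up:

// Register over WebSocket, then place an audio+video call via the browser's WebRTC stack.
import JsSIP from "jssip";

const socket = new JsSIP.WebSocketInterface("wss://sip.example.com:8089/ws");
const ua = new JsSIP.UA({
  sockets: [socket],
  uri: "sip:alice@example.com",
  password: "secret",
});

ua.on("registered", () => {
  ua.call("sip:bob@example.com", {
    mediaConstraints: { audio: true, video: true },
  });
});

ua.start();

Whether the video actually flows end-to-end then depends on the server side: if the far end cannot handle the browser's video codec (VP8/H.264), a media server has to transcode.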
That's only one way to do it. There are many ways.
You have to distinguish between the signaling path and the media path.
On the signaling path, you have to choose a signaling protocol and a corresponding transport protocol. A browser can use WebSocket as the transport and SIP as the protocol as far as signaling is concerned. On the legacy SIP side, you need SIP over UDP, so there is a need to change the transport of the signaling, not the signaling protocol itself.
On the media path, you have two problems: the encryption and the codec. Encryption is mandatory in WebRTC but not in SIP, so you need a B2BUA to make the transition between the two worlds.
On the codec side, you either choose a codec common to both worlds or you have to transcode; the use of a media server seems mandatory here. If you have multiple parties in a conference, you will need to mix the audio and compose the video to send them to legacy SIP, in which case your media server should be an MCU.
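As a simplified illustration of the overlapping-codec case (ICE, DTLS and rtcp attributes omitted; addresses and payload types are made up), a browser offer and a legacy SIP answer might narrow down to the one codec both sides share, G.711 u-law:

WebRTC offer (excerpt):
m=audio 9 UDP/TLS/RTP/SAVPF 111 0
a=rtpmap:111 opus/48000/2
a=rtpmap:0 PCMU/8000

Legacy SIP answer (excerpt):
m=audio 10000 RTP/AVP 0
a=rtpmap:0 PCMU/8000

If no codec overlapped, the B2BUA or media server would have to transcode.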
Finally, you also have a discovery and identity problem. During the original handshake, SIP expects a user ID and a domain (which is either a DNS entry or a fixed IP), while WebRTC uses ICE. Here again, it is very likely that you will need a B2BUA to bridge the two worlds.
Asterisk, Kamailio and FreeSWITCH are likely to handle most of the above for the simple cases (one-to-one, audio). For anything more complicated, you're on your own. You might want to look at respoke.io, which was made by Digium, the company behind Asterisk.

Serverside WebRTC (streaming camera)

Use Case (stream UDP video)
Stream server-side webcam (robot) UDP video to a client browser. We would rather lose packets than have the webcam struggle to keep up over a TCP connection via Wi-Fi, which constantly cuts out.
Attempted solution
Start an Xvfb Firefox browser on the server and have that stream the webcam media source. I don't like this solution, as it's not flexible for non-webcam video and is difficult to configure.
I'm looking for something that can stream an arbitrary media source to a WebRTC connection (including the greeting and handshaking). I don't particularly care which language it is; if something already exists in Node.js, Python, C, Java or Scala, I'll use it. Otherwise I suppose I'll get to work on the problem (in that case any guidance would be appreciated).

Is there an application-agnostic signaling protocol?

Is there an application-agnostic signaling protocol?
The use case is this. We have an open-source library for a multi-agent system that supports several protocols of the application layer of the OSI model. At the moment HTTP, XMPP, and ZeroMQ are supported, for example. We would like to add high-bandwidth real-time streaming possibilities. It is logical to use RTP for that.
So, to recapitulate, we already have a connection to the other party that we can use for signalling. We want to negotiate only a new channel for data communication.
However, with respect to signaling, the current standards all seem to be tied to their application. These current "standards" seem to be SIP, RTSP, and Jingle. They all seem to use RTP or SRTP at the application layer and UDP at the transport layer. See e.g. XEP-0167.
The only thing we want to negotiate is another connection to that party that can be used for data transmission. In the Session Description Protocol all kinds of stuff about media shows up, optional phone numbers, etc. If someone can point to a signaling protocol that is meant to be application-agnostic, that would be great!
I'm a big fan of XMPP and I think you'll get what you need with it. However, since you already have HTTP as well, I want to mention that PubSubHubbub can also be used for that!
The current version of the protocol applies to any MIME type that can be transported over HTTP, so that would work.
In practice it's just a webhooks API, which makes it easy to use and to scale via load balancing.
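A minimal subscriber sketch in Node/TypeScript using only the built-in http module (the hub, topic, and callback URLs are placeholders I made up): the hub verifies the subscription with a GET carrying hub.challenge, then POSTs each update, of whatever MIME type, to the callback.

import { createServer } from "node:http";

const HUB = "https://hub.example.org/";                 // hypothetical hub
const TOPIC = "https://example.org/agent-stream";       // hypothetical topic
const CALLBACK = "http://my-host.example.net:8080/cb";  // must be reachable by the hub

// Callback server: answers the hub's verification GET and receives pushed content.
createServer((req, res) => {
  const url = new URL(req.url ?? "/", CALLBACK);
  if (req.method === "GET" && url.searchParams.has("hub.challenge")) {
    res.writeHead(200, { "Content-Type": "text/plain" });
    res.end(url.searchParams.get("hub.challenge") ?? ""); // echo the challenge back
  } else if (req.method === "POST") {
    let body = "";
    req.on("data", (chunk) => (body += chunk));
    req.on("end", () => {
      console.log("pushed content:", body.slice(0, 200));
      res.writeHead(200);
      res.end();
    });
  } else {
    res.writeHead(404);
    res.end();
  }
}).listen(8080);

// Ask the hub to subscribe the callback to the topic (Node 18+ global fetch).
fetch(HUB, {
  method: "POST",
  headers: { "Content-Type": "application/x-www-form-urlencoded" },
  body: new URLSearchParams({
    "hub.mode": "subscribe",
    "hub.topic": TOPIC,
    "hub.callback": CALLBACK,
  }),
}).catch(console.error);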
Is there an application-agnostic signaling protocol?
Yes, there are lots, and you already mention a number of them, such as XMPP, SIP, and RTSP. You could also add the brand-new WebRTC protocol to the list.
We would like to add high-bandwidth real-time streaming possibilities. It is logical to use RTP for that.
Yes. RTP is lightweight and, as its name suggests, was designed for carrying real-time traffic. It's also popular, so you will be able to find numerous existing implementations.
The only thing we want to negotiate is another connection to that party that can be used for data transmission. In the Session Description Protocol all kinds of stuff about media shows up, optional phone numbers, etc. If someone can point to a signaling protocol that is meant to be application-agnostic, that would be great!
I'm not sure what you mean here. Session Description Protocol (SDP) is a standard way to describe the media capabilities of a device. It's commonly used in SIP and RTSP (and XMPP has something equivalent); however, it's separate from those protocols, and if you don't want to use it you are free to come up with your own way of describing media.
You may be getting overwhelmed by some of the SDP examples, and they can indeed get very complicated when there are multiple streams and codecs offered. However, an SDP payload can also be very simple; below is an SDP example for an RTSP server offering a single MJPEG video stream.
v=0
o=- - 0 IN IP4 0.0.0.0
s=-
t=0 0
m=video 0 RTP/AVP 26
If you just need a signalling protocol that is system and application agnostic, XMPP is the way to go.

Interoperability of SIP/H.323/IAX2

I am curious to know whether interoperability exists between these three protocols. For example, can a call that originated over SIP go through an H.323 network? An article or book link about this topic would be much appreciated. Thanks.
SIP, H.323 and IAX2 are all different protocols and are not directly interoperable. That is, you cannot connect a SIP phone to an H.323 device and make a call.
The problems these protocols solve are all similar (e.g. making a voice or video call). Protocol converters and other devices (such as gateways) are available and can do the conversion.
You may also have to transcode the audio and video data from one codec to another, but you may also have to do that on a SIP-SIP or H.323-H.323 call.
Many PBXes and softswitches support both SIP and H.323; Asterisk supports all three (SIP, H.323, and IAX2).
