48

I have a C++ process running in the background that will be generating 'events' infrequently that a Python process running on the same box will need to pick up.

  • The code on the C side needs to be as lightweight as possible.
  • The Python side is read-only.
  • The implementation must be cross-platform.
  • The data being sent is very simple.

What are my options?

Thanks

2 Comments

  • What OS are your programs running under? Pipe-based IPC is very easy to implement on Unix-like machines. Commented Aug 2, 2011 at 16:42
  • "the C side" is a typo, right? Commented Aug 8, 2018 at 11:56

7 Answers

62

ZeroMQ -- and nothing else. Encode the messages as strings.

However, if you want serialization from a library, use protobuf: it will generate classes for Python and C++. You use the SerializeToString() and ParseFromString() functions on either end, and then pipe the strings via ZeroMQ.

Problem solved, as I doubt any other solution is faster, and neither will any other solution be as easy to wire-up and simple to understand.

If you want to use OS-specific primitives for RPC, such as named pipes on Windows and Unix domain sockets on Unix, then you should look at Boost::ASIO. However, unless you have (a) a networking background and (b) a very good understanding of C++, this will be very time consuming.
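The wiring this answer describes can be sketched in a few lines. Below is a minimal sketch using pyzmq with a PUSH/PULL pair, with both ends in one process for illustration; the endpoint name and payload are invented. The real C++ daemon would call zmq_send() against the same endpoint, typically over tcp:// rather than inproc://:

```python
import zmq

ctx = zmq.Context.instance()

# The Python reader: a PULL socket bound to a known endpoint.
pull = ctx.socket(zmq.PULL)
pull.bind("inproc://events")  # across processes this would be e.g. tcp://127.0.0.1:5555

# Stand-in for the C++ side: a PUSH socket connected to the same endpoint.
push = ctx.socket(zmq.PUSH)
push.connect("inproc://events")

# Events encoded as plain strings, per the answer.
push.send_string("sensor:42")
msg = pull.recv_string()
print(msg)  # sensor:42
```

PUSH/PULL fits the one-way, infrequent-event case; if the Python side should be able to start late and miss nothing, a queueing pattern or persistence layer would be needed on top.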


6 Comments

+1 for multiple options. And pointing out that protobuf is only a solution for the serialization aspect.
I chose zeromq because the server-side implementation is 12 lines of code!! I don't like taking on dependencies if I don't have to but zeromq is the exception. :)
Yes, zeromq is designed exactly for your use case. It is very primitive and very easy to understand. Its primitiveness is robust, though, as you can implement more complex messaging constructs on top of it. In my work I chose to implement my own RPC system on top of Boost::ASIO since I needed the system primitives I mentioned above.
ZeroMQ is the worst. I have done exactly this with ZeroMQ and am now switching to anything else. ZeroMQ has no concept of failure at all. If you try to send a message and the receiving process has gone down, it is impossible to tell; it will just keep trying to send forever. There are many other issues where failure is completely opaque, and thus retry is impossible too.
@ghostbust555 It's been a long time since I worked with zeromq. "No concept of failure at all" -- in other words, "fire and forget" -- and there is nothing wrong with fire-and-forget messaging. You can also build failure handling on top of zeromq if you need it. Having said that, these days I might lean towards gRPC, though it has quite a heavy Python dependency footprint if I remember correctly.
5

Use ZeroMQ; it's about as simple as you can get.

2 Comments

Nice project, with good documentation. Thanks for pointing this out!
Seems really great indeed. And it seems truly portable, flexible and fast.
5

Google's protobuf is a great library for RPC between programs. It generates bindings for Python and C++.

If you need a distributed messaging system, you could also use something like RabbitMQ, zeromq, or ActiveMQ. See this question for a discussion on the message queue libraries.

3 Comments

RabbitMQ is a bazooka compared to ZeroMQ, which is a fly-swatter ;)
The OP didn't specify if a "bazooka" was needed, so I presented the one that I think is the most popular. I've edited my answer to include zeromq and ActiveMQ as well, and pointed to another SO question on that topic.
I think protobuf is just a serialization library for portable transportation of the message itself. It does not seem to provide any mechanism for RPC calls and IPC.
2

Another option is to just call your C code from your Python code using the ctypes module rather than running the two programs separately.
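The ctypes pattern can be sketched as follows. Since the daemon's library name and exports are unknown, this demo loads the system C library and calls strlen(), which stands in for a hypothetical getData() export:

```python
import ctypes
import ctypes.util

# Locate the C library; the path differs per platform (e.g. libc.so.6 on
# glibc Linux), so we fall back to a common Linux name if lookup fails.
libc_path = ctypes.util.find_library("c") or "libc.so.6"
libc = ctypes.CDLL(libc_path)

# Always declare argument/return types; ctypes defaults everything to int.
libc.strlen.argtypes = [ctypes.c_char_p]
libc.strlen.restype = ctypes.c_size_t

n = libc.strlen(b"hello")
print(n)  # 5
```

Note that this only helps if the C++ logic can be packaged as a library the Python process loads; as the comment below points out, it does not apply to an already-running daemon.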

1 Comment

The C process is a daemon and is running in the background all the time.
1

How complex is your data? If it is simple, I would serialize it as a string. If it is moderately complex, I would use JSON. TCP is a good cross-platform IPC transport. Since you say this IPC is infrequent, performance isn't very important, and TCP+JSON will be fine.
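The TCP+JSON approach needs only the stdlib on the Python side. A minimal sketch with both ends in one process for illustration: a server thread stands in for the C++ daemon, sending one newline-delimited JSON event (the payload fields are invented):

```python
import json
import socket
import threading

def serve_one(srv):
    # Stand-in for the C++ daemon: accept one reader, emit one event.
    conn, _ = srv.accept()
    with conn:
        conn.sendall((json.dumps({"event": "update", "value": 42}) + "\n").encode())

srv = socket.socket()
srv.bind(("127.0.0.1", 0))  # port 0: let the OS pick a free port
srv.listen(1)
port = srv.getsockname()[1]
t = threading.Thread(target=serve_one, args=(srv,))
t.start()

# The Python reader: connect and parse one newline-delimited JSON event.
with socket.create_connection(("127.0.0.1", port)) as c:
    line = c.makefile().readline()
event = json.loads(line)

t.join()
srv.close()
print(event)  # {'event': 'update', 'value': 42}
```

Newline-delimited JSON keeps message framing trivial on the C++ side too: write one JSON object per line to the socket.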

Comments

1

You can use Google gRPC for this.

Comments

-3

I would say create a DLL that manages the communication between the two. Python loads the DLL and calls a method like getData(), and the DLL in turn communicates with the process and gets the data. That should not be hard. Alternatively, you can use an XML file, an SQLite database, or any database to exchange the data: the daemon updates the DB and Python keeps querying it. There might be a field indicating whether the data in the DB has already been updated by the daemon, and then Python queries. Of course it depends on performance and accuracy requirements!

2 Comments

Because I don't want to fly to Japan to get Sushi when I can pick up the phone and get it delivered. :)
@Stefano The criticism is obvious... My downvote is for a non cross-platform solution (see tags).
