Frequently Asked Questions (FAQ)
This section addresses common questions and issues that users encounter when working with RxInfer. The FAQ is a living document that grows based on community feedback and common usage patterns.
General Questions
What is RxInfer?
RxInfer is a Julia package for automated Bayesian inference on factor graphs using reactive message passing. It provides an efficient, scalable framework for probabilistic programming with a focus on streaming data.
How does RxInfer compare to other probabilistic programming packages?
See our detailed comparison guide for a comprehensive analysis of RxInfer vs. other tools like Turing.jl, Stan, and PyMC.
Is RxInfer suitable for beginners?
Yes! RxInfer provides a user-friendly syntax through GraphPPL and comprehensive documentation. Start with the Getting Started guide and work through examples.
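As a first taste, a minimal model can be sketched as follows (a Beta–Bernoulli coin model in the style of the Getting Started guide; the prior parameters and data are illustrative):

```julia
using RxInfer

# A simple Beta-Bernoulli model: θ is the unknown probability of heads,
# y is a vector of observed coin flips (1.0 = heads, 0.0 = tails).
@model function coin_model(y)
    θ ~ Beta(1.0, 1.0)   # uniform prior over the coin bias
    y .~ Bernoulli(θ)    # each observation is a Bernoulli draw with parameter θ
end

result = infer(model = coin_model(), data = (y = [1.0, 0.0, 1.0, 1.0],))
mean(result.posteriors[:θ])   # posterior mean of the coin bias
```

With a conjugate prior like this, the posterior over `θ` is computed analytically by message passing, without sampling.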
Installation and Setup
See Installation for details.
I'm getting dependency conflicts. What should I do?
Try `Pkg.resolve()` to resolve conflicts. See the Pkg.jl documentation for more details.
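A typical session re-resolves the project's dependency graph; updating packages afterwards is an optional follow-up if resolution alone does not help:

```julia
using Pkg

Pkg.resolve()   # re-resolve the dependency graph against the current Project.toml
Pkg.update()    # optionally, update packages to the newest compatible versions
```

The same commands are available in the package REPL mode as `resolve` and `update`.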
Can I use RxInfer from Python?
Yes! See our guide on Using RxInfer from Python.
Model Specification
What's the difference between `=` and `:=` in model specification?
- `=` is the regular Julia assignment operator; use it only for regular Julia variables.
- `:=` creates a random variable node; use it to create latent variables in your model.
See Sharp Bits: Using `=` instead of `:=` for details.
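A sketch contrasting the two operators (a hypothetical model written in GraphPPL syntax; see the Sharp Bits section for the authoritative explanation of the nodes `:=` creates):

```julia
using RxInfer

@model function scaled_gaussian(y)
    c = 2.0                           # `=`: plain Julia assignment, c is an ordinary constant
    x ~ Normal(mean = 0.0, var = 1.0) # `~`: creates a random variable node
    z := c * x                        # `:=`: creates a node for the transformed latent variable
    y ~ Normal(mean = z, var = 1.0)
end
```

Writing `z = c * x` here instead would perform a plain assignment at model-construction time and would not add `z` to the factor graph.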
How do I handle missing/incomplete data?
RxInfer supports missing data through Julia's `missing` value. The inference engine will automatically handle missing observations. See Missing Data for details.
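For instance, a data vector can simply contain `missing` entries. The sketch below uses a hypothetical random-walk smoothing model; the engine is expected to infer the latent state at the missing time steps from the surrounding observations:

```julia
using RxInfer

@model function random_walk(y)
    x[1] ~ Normal(mean = 0.0, var = 100.0)      # vague prior on the initial state
    y[1] ~ Normal(mean = x[1], var = 1.0)
    for i in 2:length(y)
        x[i] ~ Normal(mean = x[i - 1], var = 1.0)  # latent random walk
        y[i] ~ Normal(mean = x[i], var = 1.0)      # noisy observations
    end
end

data   = (y = [1.0, missing, 2.5, missing, 3.0],)  # gaps in the observations
result = infer(model = random_walk(), data = data)
```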
How do I create custom nodes and message update rules?
See Custom Node and Rules for detailed guidance on extending RxInfer.
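As a rough illustration of the extension points, ReactiveMP (RxInfer's inference backend) exposes macros such as `@rule` for defining message update rules. The fragment below is a hypothetical rule for propagating point masses through addition, written in the style of the ReactiveMP documentation, and is not taken verbatim from the library:

```julia
using ReactiveMP

# Hypothetical sketch: the outgoing message on the :out edge of a `+` node,
# given point-mass messages on both inputs, is the point mass at their sum.
@rule typeof(+)(:out, Marginalisation) (m_in1::PointMass, m_in2::PointMass) = begin
    return PointMass(mean(m_in1) + mean(m_in2))
end
```

Consult the Custom Node and Rules guide for the exact macro signatures and the full set of required methods.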
Inference Issues
I'm getting "Rule not found" errors. What does this mean?
This error occurs when RxInfer can't find appropriate message update rules for your model. See Rule Not Found Error for solutions.
I'm getting a "Stack overflow in inference" error. What should I do?
See Stack Overflow during inference for more details.
My inference is running very slowly. How can I improve performance?
Check our Performance Tips section for optimization strategies.
How do I debug inference problems?
Check out the Debugging guide.
Performance and Scaling
How large can my models be?
RxInfer can handle models with millions of latent variables. Performance depends on:
- Model complexity (e.g. simple models built from conjugate prior–likelihood pairs are the fastest)
- Available memory (large models require more memory)
- Computational resources (more cores, more memory, faster CPU, etc.)
- Optimization techniques used (see Performance Tips)
Can I use RxInfer for real-time applications?
Yes! RxInfer is designed for real-time inference with reactive message passing. See our streaming inference documentation.
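A streaming setup can be sketched with the `@autoupdates` mechanism, which feeds each posterior back in as the prior for the next observation. This is a hypothetical one-dimensional tracking example; the exact names and keyword arguments should be checked against the streaming inference documentation:

```julia
using RxInfer

# One-step state-space model: the prior over the current state is parameterized
# by the previous posterior's mean and variance.
@model function tracker(y, x_mean, x_var)
    x ~ Normal(mean = x_mean, var = x_var)
    y ~ Normal(mean = x, var = 1.0)
end

autoupdates = @autoupdates begin
    x_mean, x_var = mean_var(q(x))   # previous posterior becomes the next prior
end

init = @initialization begin
    q(x) = NormalMeanVariance(0.0, 100.0)
end

# A Rocket.jl observable of named tuples; built from a static vector here,
# but any live data source producing (y = value,) tuples would work.
observations = from([1.0, 1.5, 2.0]) |>
    map(NamedTuple{(:y,), Tuple{Float64}}, v -> (y = v,))

engine = infer(
    model          = tracker(),
    datastream     = observations,
    autoupdates    = autoupdates,
    initialization = init,
    autostart      = true,
)
```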
Community and Support
Where can I get help?
- Documentation: Start with the relevant sections in the User Guide
- GitHub Discussions: Ask questions and share experiences
- Issues: Report bugs and request features
- Community Meetings: Join regular public discussions, more info here
How can I contribute?
See our Contributing Guide for ways to help. Any help is welcome!
Contributing to the FAQ
This FAQ grows through community contributions! If you have questions that aren't covered here:
- Check existing discussions on GitHub
- Ask your question in GitHub Discussions
- Consider contributing the answer back to this FAQ
- Open an issue if you find a documentation gap
How to add questions to the FAQ
- Open a discussion or issue with your question
- If it's a common question, consider adding it here
- Follow the Contributing to Documentation guide
- Use clear, concise language and include code examples when helpful
Note: This FAQ is maintained by the community. For the most up-to-date information, check GitHub discussions and issues. If you find outdated information, please help us keep it current!