
JAX is a numerical computing and machine learning library, developed by Google, that has gained popularity in the research community. Here's a breakdown of what it is and how it compares to other popular libraries like TensorFlow (TF) and PyTorch:
Official quickstart
https://jax.readthedocs.io/en/latest/notebooks/quickstart.html
https://colab.research.google.com/github/google/jax/blob/main/docs/notebooks/quickstart.ipynb

Translated materials (Korean)
https://jax-kr.readthedocs.io/ko/latest/JAX101/01-jax-basics-translated.html
Tutorial videos
https://github.com/gordicaleksa/get-started-with-JAX#tutorial-1-from-zero-to-hero
Modulabs (모두의 연구소)
https://www.youtube.com/watch?v=wN7JxvHz2i4
Automatic Differentiation: At its core, JAX is a numerical computing library that provides automatic differentiation. This means it can compute gradients of functions, which is essential for training machine learning models.
Functional Programming: JAX promotes a functional programming style, which can make certain types of mathematical code clearer and more concise.
GPU and TPU Support: JAX can run on both GPUs and TPUs, making it suitable for high-performance computing tasks.
Just-In-Time Compilation: JAX uses XLA (Accelerated Linear Algebra) for just-in-time compilation, which can optimize and speed up computations.
Flexibility: JAX provides a lower-level interface than some other machine learning libraries, which can give researchers more flexibility to implement custom algorithms and models.
Performance: Due to its just-in-time compilation with XLA, JAX can offer significant speedups, particularly for numerical code whose operations XLA can fuse into optimized kernels.
Functional Approach: For those who prefer or are familiar with functional programming, JAX's style can be more intuitive.
Research-Oriented: JAX is particularly popular in the research community where the need for custom, experimental setups is common.
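The automatic differentiation described above can be sketched in a few lines. This is a minimal illustrative example, not from the original text: `jax.grad` transforms a pure Python function into a new function that computes its gradient.

```python
# Minimal sketch of JAX's automatic differentiation with jax.grad.
# The loss function and values below are illustrative.
import jax
import jax.numpy as jnp

def loss(w):
    # Simple quadratic: f(w) = sum(w**2)
    return jnp.sum(w ** 2)

grad_loss = jax.grad(loss)       # returns a new function computing df/dw
w = jnp.array([1.0, 2.0, 3.0])
print(grad_loss(w))              # gradient of sum(w**2) is 2*w -> [2. 4. 6.]
```

Note that `grad` composes with the other transformations (e.g. `jax.jit`), which is the heart of JAX's functional design.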
Maturity: TensorFlow and PyTorch are more mature and have been around for longer. They have larger communities, more resources, and extensive ecosystems.
Ecosystem and Tools: TensorFlow and PyTorch have more built-in tools and utilities for various tasks (e.g., TensorBoard for visualization in TensorFlow). They also have more pre-trained models available.
Dynamic vs. Static Computation Graph: PyTorch uses a dynamic computation graph, which can be more intuitive for certain tasks. TensorFlow originally used a static graph but introduced "Eager Execution" for dynamic graph capabilities. JAX, on the other hand, traces pure Python functions and compiles the traced computation with XLA, so the user never builds or manages a graph object directly.
Learning Curve: TensorFlow and PyTorch, due to their higher-level abstractions, might be easier for beginners to pick up. JAX might have a steeper learning curve, especially for those unfamiliar with functional programming.
Customizability: While all three libraries offer a good degree of customizability, JAX's lower-level approach can provide more flexibility for certain advanced use cases.
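The just-in-time compilation and tracing behavior discussed in the comparison above can be sketched as follows (an illustrative toy function, not a real model): the first call to a `jax.jit`-decorated function traces it and compiles it with XLA; later calls with the same shapes and dtypes reuse the compiled code.

```python
# Minimal sketch of jax.jit: trace once, then reuse the compiled kernel.
import jax
import jax.numpy as jnp

@jax.jit
def predict(w, x):
    # A tiny "model": elementwise affine transform followed by tanh.
    return jnp.tanh(w * x + 1.0)

x = jnp.linspace(-1.0, 1.0, 5)
w = jnp.array(0.5)
y = predict(w, x)    # first call: traced and compiled via XLA
y2 = predict(w, x)   # later calls: run the cached compiled code
print(y.shape)       # (5,)
```

Because `jit` works by tracing, the decorated function must be pure (no side effects that depend on array values), which is one reason JAX pushes users toward the functional style.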
In conclusion, whether JAX or another library like TensorFlow or PyTorch is "better" depends on the specific use case, the user's familiarity with the libraries, and the desired level of abstraction. JAX offers unique advantages, especially for researchers and those looking for a functional approach to machine learning, but TensorFlow and PyTorch remain dominant players in the field with extensive ecosystems.
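As a concrete taste of the functional style mentioned throughout: JAX arrays are immutable, so "in-place" updates are written with the `.at[...]` syntax, which returns a new array. The values below are illustrative.

```python
# JAX arrays are immutable: updates return new arrays instead of mutating.
import jax.numpy as jnp

a = jnp.zeros(3)
b = a.at[1].set(5.0)   # returns a new array; `a` is unchanged
print(a)               # [0. 0. 0.]
print(b)               # [0. 5. 0.]
```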