Flux (machine-learning framework)



Flux is an open-source machine-learning software library and ecosystem written in Julia.[1][2] It provides a layer-stacking interface for simpler models and emphasizes interoperability with other Julia packages rather than a monolithic design.[3] For example, GPU support is implemented transparently by CuArrays.jl.[4] This contrasts with machine-learning frameworks implemented in other languages and exposed to Julia through bindings, such as TensorFlow.jl, which are limited by the functionality present in the underlying implementation, often written in C or C++.[5] Flux joined NumFOCUS as an affiliated project in December 2021.[6]
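The layer-stacking interface mentioned above can be illustrated with a minimal sketch. The model architecture and dimensions here are illustrative, not taken from the article; the sketch assumes a recent version of Flux.jl is installed.

```julia
using Flux

# A small multilayer perceptron for flattened 28×28 grayscale images,
# built by composing layers with Chain.
model = Chain(
    Dense(784 => 32, relu),   # affine layer followed by ReLU activation
    Dense(32 => 10),          # output layer, one score per class
    softmax)                  # normalize scores into probabilities

x = rand(Float32, 784)        # a dummy input vector
y = model(x)                  # forward pass; the entries of y sum to 1
```

Because Flux models are ordinary Julia objects, moving them to the GPU is done by a package-level conversion (e.g. piping the model through `gpu`) rather than by a separate device-specific API.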

Flux's focus on interoperability has enabled, for example, support for neural differential equations by combining Flux.jl and DifferentialEquations.jl into DiffEqFlux.jl.[7][8]
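A hedged sketch of what this combination looks like in practice: a Flux network is used as the right-hand side of an ODE, and DiffEqFlux wraps it into a differentiable layer. Exact names and signatures (`NeuralODE`, the solver choice `Tsit5`, keyword arguments) vary across DiffEqFlux versions, so this is illustrative rather than definitive.

```julia
using DiffEqFlux, OrdinaryDiffEq, Flux

# A small network defining the dynamics du/dt = f(u).
dudt = Chain(Dense(2 => 16, tanh), Dense(16 => 2))

tspan = (0.0f0, 1.0f0)                      # integrate from t = 0 to t = 1
node  = NeuralODE(dudt, tspan, Tsit5(),     # wrap the network as an ODE layer
                  saveat = 0.1f0)           # record the solution every 0.1

u0  = Float32[1.0, 0.0]                     # initial state
sol = node(u0)                              # solve the ODE through the network
```

The point of the design is that the same network parameters are visible to both the ODE solver and Flux's training machinery, so the dynamics can be fitted by gradient descent.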

Flux supports recurrent and convolutional networks. It is also capable of differentiable programming[9][10][11] through its source-to-source automatic differentiation package, Zygote.jl.[12]
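Zygote's source-to-source differentiation operates on ordinary Julia functions, with no special tracing API. A minimal sketch (the function `f` is an invented example):

```julia
using Zygote

# An ordinary Julia function; no annotations are needed.
f(x) = 3x^2 + 2x + 1

# gradient returns a tuple, one entry per argument.
df(x) = gradient(f, x)[1]

df(2.0)   # 6x + 2 evaluated at x = 2, i.e. 14.0
```

Because differentiation happens at the level of Julia source, the same mechanism works for control flow, user-defined types, and code from other packages, which is what "differentiable programming" refers to above.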

Julia is a popular language for machine learning,[13] and Flux.jl is its most highly regarded machine-learning repository.[13] A demonstration[14] compiling Julia code to run on Google's tensor processing unit (TPU) received praise from Google Brain AI lead Jeff Dean.[15]

Flux has been used as a framework to build neural networks that work with homomorphically encrypted data without ever decrypting it.[16][17] This kind of application is envisioned to be central to the privacy of future APIs built on machine-learning models.[18]

Flux.jl is an intermediate representation for running high level programs on CUDA hardware.[19][20] It was the predecessor to CUDAnative.jl which is also a GPU programming language.[21]


References

Template:Reflist
