Libraries written in Jupyter Notebook

neural-tangents

Fast and easy infinite neural networks in Python (usage sketch below).
  • 2.1k
  • Apache License 2.0
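
A minimal usage sketch, assuming the stax layer API documented upstream (Dense, Relu, serial) and JAX arrays; it is an illustration, not code from this listing:

    # Minimal sketch: analytic NTK/NNGP kernels for an infinite-width MLP
    # (assumes neural-tangents' stax API; not taken from this listing).
    import jax.numpy as jnp
    from neural_tangents import stax

    # stax returns the usual (init_fn, apply_fn) pair plus an analytic kernel_fn
    # describing the network's infinite-width limit.
    init_fn, apply_fn, kernel_fn = stax.serial(
        stax.Dense(512), stax.Relu(),
        stax.Dense(512), stax.Relu(),
        stax.Dense(1),
    )

    x1 = jnp.ones((3, 8))  # 3 inputs of dimension 8
    x2 = jnp.ones((4, 8))  # 4 inputs of dimension 8

    ntk = kernel_fn(x1, x2, 'ntk')    # Neural Tangent Kernel, shape (3, 4)
    nngp = kernel_fn(x1, x2, 'nngp')  # NNGP kernel, shape (3, 4)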

GPEN

  • 2.0k

carefree-creator

AI magics meet an infinite drawing board.
  • 2.0k
  • MIT

FinanceDatabase

This is a database of 300,000+ symbols containing Equities, ETFs, Funds, Indices, Currencies, Cryptocurrencies and Money Markets.
  • 2.0k
  • MIT

awesome-notebooks

Ready-to-use data & AI templates, organized by tools, to kickstart your projects and data products in minutes. 😎 Published by the Naas community.
  • 2.0k
  • BSD 3-clause "New" or "Revised"

zero-to-mastery-ml

All course materials for the Zero to Mastery Machine Learning and Data Science course.
  • 2.0k

TensorRT

PyTorch/TorchScript/FX compiler for NVIDIA GPUs using TensorRT (by pytorch; compilation sketch below).
  • 2.0k
  • BSD 3-clause "New" or "Revised"
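
A minimal compilation sketch, assuming the torch_tensorrt.compile API and a CUDA-capable GPU; the model and shapes are illustrative, not taken from this listing:

    # Compile a PyTorch model into a TensorRT-accelerated module
    # (assumes torch_tensorrt is installed and a CUDA GPU is available).
    import torch
    import torch_tensorrt
    import torchvision.models as models

    model = models.resnet18().eval().cuda()

    trt_model = torch_tensorrt.compile(
        model,
        inputs=[torch_tensorrt.Input((1, 3, 224, 224))],  # fixed input shape
        enabled_precisions={torch.half},                   # allow FP16 kernels
    )

    x = torch.randn(1, 3, 224, 224, device="cuda")
    with torch.no_grad():
        y = trt_model(x)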

tensorflow-onnx

Convert TensorFlow, Keras, Tensorflow.js and Tflite models to ONNX (conversion sketch below).
  • 2.0k
  • Apache License 2.0
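
A minimal conversion sketch, assuming tf2onnx's from_keras helper (the project also ships a CLI, python -m tf2onnx.convert); the toy Keras model is illustrative:

    # Convert a small Keras model to ONNX with tf2onnx
    # (assumes tf2onnx.convert.from_keras as documented upstream).
    import tensorflow as tf
    import tf2onnx

    model = tf.keras.Sequential([
        tf.keras.Input(shape=(4,)),
        tf.keras.layers.Dense(2, activation="softmax"),
    ])

    spec = (tf.TensorSpec((None, 4), tf.float32, name="input"),)
    onnx_model, _ = tf2onnx.convert.from_keras(
        model, input_signature=spec, output_path="model.onnx"
    )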

100-pandas-puzzles

100 data puzzles for pandas, ranging from short and simple to super tricky (60% complete). An example puzzle is sketched below.
  • 2.0k
  • MIT
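
A small puzzle in the spirit of the collection, using plain pandas; it is not taken from the repository itself:

    # Two typical mini-puzzles: boolean filtering and a groupby aggregation.
    import numpy as np
    import pandas as pd

    df = pd.DataFrame({"animal": ["cat", "dog", "snake", "dog", "cat"],
                       "age":    [2.5, 3.0, 0.5, np.nan, 5.0]})

    # Rows where age is between 2 and 4 (inclusive).
    print(df[df["age"].between(2, 4)])

    # Mean age per animal, ignoring missing values.
    print(df.groupby("animal")["age"].mean())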

fma

FMA: A dataset for music analysis.
  • 2.0k
  • MIT

kubric

A data generation pipeline for creating semi-realistic synthetic multi-object videos with rich annotations such as instance segmentation masks, depth maps, and optical flow.
  • 2.0k
  • Apache License 2.0

gs-quant

Python toolkit for quantitative finance.
  • 1.9k
  • Apache License 2.0

checklist

Beyond Accuracy: Behavioral testing of NLP models with CheckList.
  • 1.9k
  • MIT

SimCLR

PyTorch implementation of SimCLR: A Simple Framework for Contrastive Learning of Visual Representations (by sthalles; loss sketch below).
  • 1.9k
  • MIT
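
A simplified sketch of the NT-Xent contrastive loss that SimCLR optimizes, written as generic PyTorch; it is an illustration, not sthalles' implementation:

    # NT-Xent: each image yields two augmented views; a view's positive is its
    # counterpart from the other view, every other view in the batch is a negative.
    import torch
    import torch.nn.functional as F

    def nt_xent(z1, z2, temperature=0.5):
        """z1, z2: (N, D) projection-head outputs for two views of N images."""
        n = z1.size(0)
        z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)  # (2N, D), unit norm
        sim = z @ z.t() / temperature                       # scaled cosine similarities
        sim.fill_diagonal_(float("-inf"))                   # exclude self-similarity
        targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)])
        return F.cross_entropy(sim, targets)

    loss = nt_xent(torch.randn(8, 128), torch.randn(8, 128))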

FinMind

Open Data, more than 50 financial datasets (mainly Taiwan stocks), updated daily. https://finmind.github.io/.
  • 1.9k
  • Apache License 2.0

Alpaca-CoT

We unified the interfaces of instruction-tuning data (e.g., CoT data, still being expanded), multiple LLMs and parameter-efficient methods (e.g., lora, p-tuning) for easy use, building an LLM-IFT research platform that is easy for researchers to get started with. Meanwhile, the tabular_llm branch builds a Tabular LLM for table-oriented intelligence tasks.
  • 1.9k
  • Apache License 2.0

CodeSearchNet

Datasets, tools, and benchmarks for representation learning of code.
  • 1.9k
  • MIT

MEDIUM_NoteBook

Repository containing notebooks of my posts on Medium.
  • 1.9k
  • MIT

jellyfish

🪼 A Python library for approximate and phonetic matching of strings (usage sketch below).
  • 1.9k
  • MIT
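
A minimal usage sketch with standard jellyfish calls (function names as in the upstream docs; jaro_winkler_similarity assumes a reasonably recent release):

    # Approximate string matching and phonetic encoding with jellyfish.
    import jellyfish

    print(jellyfish.levenshtein_distance("jellyfish", "smellyfish"))  # edit distance
    print(jellyfish.jaro_winkler_similarity("martha", "marhta"))      # similarity in [0, 1]

    # Phonetic codes: words that sound alike map to the same code.
    print(jellyfish.soundex("Jellyfish"))
    print(jellyfish.metaphone("Jellyfish"))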

SfMLearner

An unsupervised learning framework for depth and ego-motion estimation from monocular videos.
  • 1.9k
  • MIT

DeepLearningForNLPInPytorch

An IPython Notebook tutorial on deep learning for natural language processing, including structure prediction.
  • 1.9k
  • MIT

Andrew-NG-Notes

Andrew NG's handwritten Coursera notes.
  • 1.8k

simple-llm-finetuner

Simple UI for LLM model finetuning.
  • 1.8k
  • MIT

NAB

The Numenta Anomaly Benchmark.
  • 1.8k
  • GNU Affero General Public License v3.0

pymc-resources

PyMC educational resources.
  • 1.8k
  • MIT

ecco

Explain, analyze, and visualize NLP language models. Ecco creates interactive visualizations directly in Jupyter notebooks, explaining the behavior of Transformer-based language models (such as GPT2, BERT, RoBERTa, T5, and T0).
  • 1.8k
  • BSD 3-clause "New" or "Revised"

stable-diffusion

  • 1.7k
  • GNU Affero General Public License v3.0

ganspace

Discovering interpretable GAN controls [NeurIPS 2020].
  • 1.7k
  • Apache License 2.0

pythoncode-tutorials

The Python Code tutorials.
  • 1.7k
  • MIT

chain-of-thought-hub

Benchmarking large language models' complex reasoning ability with chain-of-thought prompting.
  • 1.7k
  • MIT