docker run -p 8080:8080 --name local-ai -ti localai/localai:latest-aio-cpu
Or, if you have an Nvidia GPU and are using CUDA 12, you can run:
docker run -p 8080:8080 --gpus all --name local-ai -ti localai/localai:latest-aio-gpu-nvidia-cuda-12
Alternatively, you can use the installer script provided by the developers:

curl https://localai.io/install.sh | sh
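Whichever installation path you choose, the WebUI and the OpenAI-compatible API are served on port 8080 by default. As a quick sanity check (a minimal sketch, assuming the default port and no API key configured), you can ask the server which models it currently knows about:

curl http://localhost:8080/v1/models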
Main Features

From the menu we can choose the main features, which are ready to use once the desired model has been downloaded: chat for consulting an assistant, generate images for turning text into pictures, TTS for converting text to speech, and talk for conversing with the assistant by voice.

Beyond these main features, LocalAI's documentation is available, including how to talk to it through its API; a small example follows at the end of this section. Another feature worth exploring if you want to broaden scope and resources is swarm.

The range of supported models is just as interesting. The models menu lists the various supported models, 461 of them at the time of writing, and using one is as easy as clicking install.

For further exploration, see the project's development page.
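As a small illustration of that API, LocalAI exposes OpenAI-compatible endpoints, so a chat request can be sent with plain curl. This is a minimal sketch that assumes the server runs on the default port 8080 and that a model is available under the alias gpt-4 (the AIO images preconfigure such aliases; otherwise substitute the name of a model you installed from the gallery):

curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "gpt-4",
        "messages": [{"role": "user", "content": "Hello, what can you do locally?"}]
      }'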