{"id":4084,"date":"2025-11-24T11:12:24","date_gmt":"2025-11-24T10:12:24","guid":{"rendered":"https:\/\/hainzelman.com\/?post_type=faq&#038;p=4084"},"modified":"2025-11-24T11:12:25","modified_gmt":"2025-11-24T10:12:25","slug":"welche-ki-modelle-kann-ich-verwenden","status":"publish","type":"faq","link":"https:\/\/hainzelman.com\/en\/faq\/welche-ki-modelle-kann-ich-verwenden\/","title":{"rendered":"Which AI models can I use?"},"content":{"rendered":"<p>The system is <strong>model-agnostic<\/strong>. It allows a free choice of the AI models used:<\/p>\n\n\n\n<p>\u2022 Proprietary models (e.g. OpenAI, Anthropic, Google).<\/p>\n\n\n\n<p>\u2022 Models hosted in a private cloud (e.g. Azure EU) or on-premise.<\/p>\n\n\n\n<p>\u2022 Self-hosting of open-source models such as Llama or Gemma is possible, ensuring that no data flows back to the companies that created them.<\/p>","protected":false},"menu_order":0,"template":"","format":"standard","categories":[233],"class_list":["post-4084","faq","type-faq","status-publish","format-standard","hentry","category-technologie-flexibilitaet-und-integration"],"acpt":null,"acf":[],"_links":{"self":[{"href":"https:\/\/hainzelman.com\/en\/wp-json\/wp\/v2\/faq\/4084","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/hainzelman.com\/en\/wp-json\/wp\/v2\/faq"}],"about":[{"href":"https:\/\/hainzelman.com\/en\/wp-json\/wp\/v2\/types\/faq"}],"wp:attachment":[{"href":"https:\/\/hainzelman.com\/en\/wp-json\/wp\/v2\/media?parent=4084"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/hainzelman.com\/en\/wp-json\/wp\/v2\/categories?post=4084"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}