
web.archive.org/web/20230320135416/https:/ai.facebook.com/blog/large-language-model-llama-meta-ai
Preview meta tags from the web.archive.org website.
Linked Hostnames (1)
- web.archive.org
Thumbnail

Search Engine Appearance
https://web.archive.org/web/20230320135416/https:/ai.facebook.com/blog/large-language-model-llama-meta-ai
Introducing LLaMA: A foundational, 65-billion-parameter language model
Today, we’re releasing our LLaMA (Large Language Model Meta AI) foundational model with a gated release. LLaMA is more efficient and competitive with...
Bing
Introducing LLaMA: A foundational, 65-billion-parameter language model
https://web.archive.org/web/20230320135416/https:/ai.facebook.com/blog/large-language-model-llama-meta-ai
Today, we’re releasing our LLaMA (Large Language Model Meta AI) foundational model with a gated release. LLaMA is more efficient and competitive with...
DuckDuckGo

Introducing LLaMA: A foundational, 65-billion-parameter language model
Today, we’re releasing our LLaMA (Large Language Model Meta AI) foundational model with a gated release. LLaMA is more efficient and competitive with...
General Meta Tags (14)
- title: Introducing LLaMA: A foundational, 65-billion-parameter language model
- charset: utf-8
- referrer: origin-when-crossorigin
- description: Today, we’re releasing our LLaMA (Large Language Model Meta AI) foundational model with a gated release. LLaMA is more efficient and competitive with...
- viewport: width=device-width, initial-scale=1
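Reassembled as markup, the general meta tags above would sit in the page's <head> roughly as follows. This is a hedged reconstruction from the listed values only; attribute order and any additional attributes on the live page are assumptions.

  <head>
    <meta charset="utf-8">
    <title>Introducing LLaMA: A foundational, 65-billion-parameter language model</title>
    <meta name="referrer" content="origin-when-crossorigin">
    <meta name="description" content="Today, we’re releasing our LLaMA (Large Language Model Meta AI) foundational model with a gated release. LLaMA is more efficient and competitive with...">
    <meta name="viewport" content="width=device-width, initial-scale=1">
  </head>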
Open Graph Meta Tags (1)
- og:image: https://web.archive.org/web/20230320195909im_/https://scontent-sea1-1.xx.fbcdn.net/v/t39.2365-6/333095137_1286826058904423_4144395724304288774_n.png?_nc_cat=110&ccb=1-7&_nc_sid=ad8a9d&_nc_ohc=ZqtkZQoSH6kAX8OaeXi&_nc_ht=scontent-sea1-1.xx&oh=00_AfBo445OJc-FHVRShsCy-Eb0ZAD2DZVY2igHk9h136L1Qg&oe=641D51EB
Twitter Meta Tags (1)
- twitter:card: summary
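Likewise, the Open Graph image and Twitter card entries map to one property-based and one name-based meta tag; a sketch built from the values shown above (a twitter:card value of summary selects the small-card preview rather than the large-image card):

  <meta property="og:image" content="https://web.archive.org/web/20230320195909im_/https://scontent-sea1-1.xx.fbcdn.net/v/t39.2365-6/333095137_1286826058904423_4144395724304288774_n.png?_nc_cat=110&ccb=1-7&_nc_sid=ad8a9d&_nc_ohc=ZqtkZQoSH6kAX8OaeXi&_nc_ht=scontent-sea1-1.xx&oh=00_AfBo445OJc-FHVRShsCy-Eb0ZAD2DZVY2igHk9h136L1Qg&oe=641D51EB">
  <meta name="twitter:card" content="summary">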
Link Tags (38)
- canonical: https://web.archive.org/web/20230320195909/https://ai.facebook.com/blog/large-language-model-llama-meta-ai/
- preload: https://web.archive.org/web/20230320195909/https://static.xx.fbcdn.net/rsrc.php/v3/yb/l/0,cross/qwZY-W2wxrT.css?_nc_x=Ij3Wp8lg5Kz
- preload: https://web.archive.org/web/20230320195909/https://static.xx.fbcdn.net/rsrc.php/v3/yU/l/0,cross/uGc2L7OIIyz.css?_nc_x=Ij3Wp8lg5Kz
- preload: https://web.archive.org/web/20230320195909/https://static.xx.fbcdn.net/rsrc.php/v3/yK/l/0,cross/Sr4rvkJAgyj.css?_nc_x=Ij3Wp8lg5Kz
- preload: https://web.archive.org/web/20230320195909/https://static.xx.fbcdn.net/rsrc.php/v3/y0/l/0,cross/3Tc81XMaGRS.css?_nc_x=Ij3Wp8lg5Kz
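The canonical and preload entries are <link> elements rather than meta tags. A sketch of the first two, assuming the preloads target stylesheets (the as="style" attribute is an assumption; the report only shows the rel type and the URL):

  <link rel="canonical" href="https://web.archive.org/web/20230320195909/https://ai.facebook.com/blog/large-language-model-llama-meta-ai/">
  <link rel="preload" href="https://web.archive.org/web/20230320195909/https://static.xx.fbcdn.net/rsrc.php/v3/yb/l/0,cross/qwZY-W2wxrT.css?_nc_x=Ij3Wp8lg5Kz" as="style">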
Links (42)
- https://web.archive.org/web/20230320195909/https://ai.facebook.com
- https://web.archive.org/web/20230320195909/https://ai.facebook.com/blog
- https://web.archive.org/web/20230320195909/https://ai.facebook.com/blog/ai-math-theorem-proving
- https://web.archive.org/web/20230320195909/https://ai.facebook.com/blog/democratizing-access-to-large-scale-language-models-with-opt-175b
- https://web.archive.org/web/20230320195909/https://ai.facebook.com/blog/dino-paws-computer-vision-with-self-supervised-transformers-and-10x-more-efficient-training