
    OpenAI’s ChatGPT: It costs $100,000 to run per day and other interesting facts to keep in mind

By National Tone | December 20, 2022

ChatGPT is an AI-powered chatbot developed by OpenAI, capable of holding human-like, conversational exchanges. The best part is that it is free to use. But while you can access ChatGPT for free, OpenAI is spending a great deal of money to keep it up and running.

Here are some lesser-known facts about OpenAI’s ChatGPT that give us a hint about what it takes to develop and run an AI service.

    The cost of running ChatGPT is $3 million per month

According to the analysis, ChatGPT is hosted on Microsoft’s Azure cloud, so OpenAI doesn’t have to buy and set up a physical server room. At current rates, Microsoft charges about $3 an hour for a single A100 GPU, which works out to roughly $0.0003 per word generated on ChatGPT.

A response from ChatGPT usually has at least 30 words, so a single response costs the company around 1 cent. It is currently estimated that OpenAI spends at least $100K per day, or $3 million per month, on running costs.
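The arithmetic behind these estimates can be sketched out in a few lines. The per-word cost and response length are the figures quoted above; the implied daily volume is my own back-calculation from the $100K/day estimate, not a number from the article:

```python
# Rough cost sketch using the figures quoted in the article:
# ~$0.0003 per generated word and ~30 words per typical response.
COST_PER_WORD_USD = 0.0003
WORDS_PER_RESPONSE = 30

cost_per_response = COST_PER_WORD_USD * WORDS_PER_RESPONSE
print(f"Cost per response: ${cost_per_response:.4f}")  # ~$0.009, about 1 cent

# Working backwards from the estimated $100K/day running cost,
# this would imply on the order of 11 million responses served per day.
DAILY_COST_USD = 100_000
responses_per_day = DAILY_COST_USD / cost_per_response
print(f"Implied responses per day: {responses_per_day:,.0f}")
```

These are order-of-magnitude numbers only; actual costs depend on negotiated cloud pricing and real traffic patterns.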

    How many GPUs does it take to run ChatGPT? And how expensive is it for OpenAI? Let’s find out! 🧵🤑

    — Tom Goldstein (@tomgoldsteincs) December 6, 2022

    A single ChatGPT query uses at least 8 GPUs

According to Goldstein, an associate professor at the University of Maryland, a single NVIDIA A100 GPU can run a 3-billion-parameter model in about 6 ms. At that rate, a single A100 would take around 350 ms to generate just one word on ChatGPT.

Given that ChatGPT’s underlying GPT-3.5 model has over 175 billion parameters, at least five A100 GPUs are needed just to load the model and generate text for a single query. And since ChatGPT outputs around 15-20 words per second, serving it requires a server with at least eight A100 GPUs.
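Goldstein’s estimate can be reproduced with simple scaling arithmetic. The 6 ms latency and 175B parameter count come from the thread above; the 80 GB per A100 and the assumption that latency scales linearly with model size (and divides evenly across GPUs) are simplifications of my own:

```python
# Back-of-the-envelope GPU sketch based on Goldstein's numbers:
# a 3B-parameter model runs in ~6 ms on one A100; GPT-3.5 has ~175B parameters.
LATENCY_3B_MS = 6
MODEL_PARAMS_B = 175

# Assuming latency scales linearly with parameter count,
# one word on a single A100 takes roughly 350 ms.
ms_per_word_one_gpu = LATENCY_3B_MS * MODEL_PARAMS_B / 3
print(f"~{ms_per_word_one_gpu:.0f} ms per word on one GPU")

# Memory: 175B parameters at 2 bytes each is ~350 GB of weights,
# versus 80 GB per A100 -- at least five GPUs just to hold the model.
gpus_for_weights = (MODEL_PARAMS_B * 2) / 80
print(f"GPUs needed to hold weights: {gpus_for_weights:.1f} (round up to 5)")

# Throughput: eight GPUs splitting the work bring per-word latency
# down to ~44 ms, i.e. roughly 20+ words per second.
words_per_sec = 1000 / (ms_per_word_one_gpu / 8)
print(f"~{words_per_sec:.0f} words/sec on 8 GPUs")
```

Real inference servers use more sophisticated model parallelism and batching, so treat this strictly as a sanity check on the headline numbers.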

    ChatGPT doesn’t have answers to all your questions

While ChatGPT is currently the most capable AI chatbot, it was trained on data collected up to 2021. Hence, it may not be able to give you accurate responses to queries about more recent events.

    ChatGPT has over one million users

Within a few days of its official launch, ChatGPT crossed 1 million users. While many of these may not be active users, the company has clearly gathered a large user base in a short time. However, OpenAI will have to do much more to retain these users and turn ChatGPT into a profitable AI tool.

