{"id":3151,"date":"2020-07-31T05:00:15","date_gmt":"2020-07-31T05:00:15","guid":{"rendered":"https:\/\/cloudxlab.com\/blog\/?p=3151"},"modified":"2020-08-03T14:37:23","modified_gmt":"2020-08-03T14:37:23","slug":"what-is-gpt3-and-will-it-take-over-the-world","status":"publish","type":"post","link":"https:\/\/cloudxlab.com\/blog\/what-is-gpt3-and-will-it-take-over-the-world\/","title":{"rendered":"What is GPT3 and will it take over the World"},"content":{"rendered":"\n<p>GPT-3 is out in private beta and has been buzzing on social media lately. It was built by OpenAI, which was founded by Elon Musk, Sam Altman and others in 2015. Generative Pre-trained Transformer 3 (GPT-3) is a gigantic model with 175 billion parameters. In comparison, the previous version, GPT-2, had 1.5 billion parameters. The larger, more complex model enables GPT-3 to do things that weren&#8217;t previously possible. <\/p>\n\n\n\n<!--more-->\n\n\n\n<h2>Applications of GPT-3<\/h2>\n\n\n\n<p>A better model opens up a lot of new possibilities. GPT-3 has been trained with 45TB of data gleaned from the internet. The applications of this model are immense. Twitter is abuzz with the initial results. <\/p>\n\n\n\n<h3>Medical Diagnosis<\/h3>\n\n\n\n<p>The model has been able to generate relevant medical text about obscure medical terms, as seen in the tweet below. <\/p>\n\n\n\n<figure class=\"wp-block-embed-twitter wp-block-embed is-type-rich is-provider-twitter\"><div class=\"wp-block-embed__wrapper\">\n<blockquote class=\"twitter-tweet\" data-dnt=\"true\" align=\"center\"><p lang=\"en\" dir=\"ltr\">So <a href=\"https:\/\/twitter.com\/OpenAI?ref_src=twsrc%5Etfw\">@OpenAI<\/a> have given me early access to a tool which allows developers to use what is essentially the most powerful text generator ever. I thought I\u2019d test it by asking a medical question. The bold text is the text generated by the AI. 
Incredible&#8230; (1\/2) <a href=\"https:\/\/t.co\/4bGfpI09CL\">pic.twitter.com\/4bGfpI09CL<\/a><\/p>&mdash; Qasim Munye (@QasimMunye) <a href=\"https:\/\/twitter.com\/QasimMunye\/status\/1278750809094750211?ref_src=twsrc%5Etfw\">July 2, 2020<\/a><\/blockquote><script async src=\"https:\/\/platform.twitter.com\/widgets.js\" charset=\"utf-8\"><\/script><script type=\"text\/javascript\">window.addEventListener(\"message\",function(e){\n                window.parent.postMessage(e.data,\"*\");\n            },false);<\/script>\n<\/div><\/figure>\n\n\n\n<h3>Programming<\/h3>\n\n\n\n<p>Another application driving significant interest is GPT-3&#8217;s direct use in software. The demo below shows how GPT-3 is able to interpret Python code and explain what it does. This could be a major productivity boost, as everyone knows how much programmers hate reading somebody else&#8217;s code. Imagine a tool that comments all of your code, including code written by engineers who left your organisation long ago.<\/p>\n\n\n\n<figure class=\"wp-block-embed-twitter wp-block-embed is-type-rich is-provider-twitter\"><div class=\"wp-block-embed__wrapper\">\n<blockquote class=\"twitter-tweet\" data-dnt=\"true\" align=\"center\"><p lang=\"en\" dir=\"ltr\">Reading code is hard! Don&#39;t you wish you could just ask the code what it does? To describe its functions, its types.<br><br>And maybe&#8230; how can it be improved?<br><br>Introducing: <a href=\"https:\/\/twitter.com\/replit?ref_src=twsrc%5Etfw\">@Replit<\/a> code oracle \ud83e\uddd9\u200d\u2640\ufe0f<br><br>It&#39;s crazy, just got access to <a href=\"https:\/\/twitter.com\/OpenAI?ref_src=twsrc%5Etfw\">@OpenAI<\/a> API and I already have a working product! 
<a href=\"https:\/\/t.co\/HX4MyH9yjm\">pic.twitter.com\/HX4MyH9yjm<\/a><\/p>&mdash; Amjad Masad (@amasad) <a href=\"https:\/\/twitter.com\/amasad\/status\/1285789362647478272?ref_src=twsrc%5Etfw\">July 22, 2020<\/a><\/blockquote><script async src=\"https:\/\/platform.twitter.com\/widgets.js\" charset=\"utf-8\"><\/script><script type=\"text\/javascript\">window.addEventListener(\"message\",function(e){\n                window.parent.postMessage(e.data,\"*\");\n            },false);<\/script>\n<\/div><\/figure>\n\n\n\n<p>GPT-3 has also been able to generate code from text. The demo below shows JSX being generated from a text description. <\/p>\n\n\n\n<figure class=\"wp-block-embed-twitter wp-block-embed is-type-rich is-provider-twitter\"><div class=\"wp-block-embed__wrapper\">\n<blockquote class=\"twitter-tweet\" data-dnt=\"true\" align=\"center\"><p lang=\"en\" dir=\"ltr\">This is mind blowing.<br><br>With GPT-3, I built a layout generator where you just describe any layout you want, and it generates the JSX code for you.<br><br>W H A T <a href=\"https:\/\/t.co\/w8JkrZO4lk\">pic.twitter.com\/w8JkrZO4lk<\/a><\/p>&mdash; Sharif Shameem (@sharifshameem) <a href=\"https:\/\/twitter.com\/sharifshameem\/status\/1282676454690451457?ref_src=twsrc%5Etfw\">July 13, 2020<\/a><\/blockquote><script async src=\"https:\/\/platform.twitter.com\/widgets.js\" charset=\"utf-8\"><\/script><script type=\"text\/javascript\">window.addEventListener(\"message\",function(e){\n                window.parent.postMessage(e.data,\"*\");\n            },false);<\/script>\n<\/div><\/figure>\n\n\n\n<p>The demos above do seem mind-blowing, but GPT-3 is not going to replace programmers. It can, however, power assistants that help programmers become faster and more efficient. <\/p>\n\n\n\n<h3>Writers, Bloggers and Journalists<\/h3>\n\n\n\n<p>GPT-3 is able to churn out text with only a few words of input. It could be fed a series of financial results and produce a short article summarising them. 
In principle, I could feed the algorithm a few blogs and articles about any topic, say GPT-3, and it could give me another article summarising them. People are exploring whether GPT-3 can be used to build a tool that helps writers get over writer&#8217;s block. <\/p>\n\n\n\n<p>Check out this <a href=\"https:\/\/www.gwern.net\/GPT-3\">site<\/a>. It demonstrates creative writing by GPT-3, with poetry, articles, puns and more. <\/p>\n\n\n\n<h3>Customer Service<\/h3>\n\n\n\n<p>Chatbots have been used extensively by many companies, mostly as a filter for incoming customer queries. With additional capabilities, we will see these chatbots becoming capable of solving more customer issues. <a href=\"https:\/\/www.cgsinc.com\/en\/resources\/2019-CGS-Customer-Service-Chatbots-Channels-Survey\">Studies<\/a> have shown that people prefer talking to humans rather than to chatbots, but as bots get more realistic, more and more people will be unable to tell the difference. Automated bots will manage an increasing proportion of the queries.<\/p>\n\n\n\n<h2>GPT-3 under the hood<\/h2>\n\n\n\n<p><a href=\"https:\/\/arxiv.org\/pdf\/2005.14165.pdf\">The GPT-3 paper &#8216;Language Models are Few-Shot Learners&#8217;<\/a> was released a little earlier and gives a deeper insight into how the model was built. NLP models have performed well for a while, but although their architectures are typically task agnostic, they need to be re-trained or fine-tuned with large task-specific datasets for each new task. This is quite different from humans. For example, we can show a human child three examples of acts of kindness, and she can tell whether a fourth example is an act of kindness or not. For humans, language is a few-shot learning process; this is not true for most models. <\/p>\n\n\n\n<p>GPT-3 has demonstrated outstanding NLP capabilities with only a few examples for a particular task. With its large model size and huge training dataset, it has built excellent in-context generalisation. 
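The few-shot setup described here can be sketched in code. This is a hypothetical illustration (the prompt format, task, and labels are my own, not from OpenAI): the model is handed a few solved examples followed by a new query, and is expected to continue the pattern.

```python
# Hypothetical sketch of a few-shot prompt for a model like GPT-3.
# A real call would send this string to the (beta, invitation-only)
# OpenAI API; here we only build the prompt itself.

def build_few_shot_prompt(examples, query):
    """Concatenate labelled examples, then the unlabelled query."""
    lines = []
    for text, label in examples:
        lines.append(f"Text: {text}\nSentiment: {label}\n")
    # The prompt ends mid-pattern; the model's completion is the answer.
    lines.append(f"Text: {query}\nSentiment:")
    return "\n".join(lines)

examples = [
    ("I loved this film", "positive"),
    ("The service was terrible", "negative"),
    ("What a wonderful day", "positive"),
]
prompt = build_few_shot_prompt(examples, "The food was awful")
print(prompt)
```

No gradient updates are involved: the three examples live only in the prompt, which is exactly what the paper means by in-context learning.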
<\/p>\n\n\n\n<figure class=\"wp-block-image size-large\"><img width=\"805\" height=\"283\" src=\"https:\/\/blog.cloudxlab.com\/wp-content\/uploads\/2020\/07\/Screenshot-2020-07-31-at-7.34.41-AM.png\" alt=\"GPT3 in-context learning\" class=\"wp-image-3156\" \/><figcaption>The figure shows multiple sequences, each with a different context. During training, the model learns the patterns in these sequences. This process is called in-context learning.<\/figcaption><\/figure>\n\n\n\n<figure class=\"wp-block-image size-large\"><img width=\"760\" height=\"394\" src=\"https:\/\/blog.cloudxlab.com\/wp-content\/uploads\/2020\/07\/Screenshot-2020-07-31-at-7.39.38-AM.png\" alt=\"\" class=\"wp-image-3157\" \/><figcaption>The figure above shows that larger models build in-context generalisations better than smaller ones. <\/figcaption><\/figure>\n\n\n\n<figure class=\"wp-block-image size-large\"><img width=\"633\" height=\"400\" src=\"https:\/\/blog.cloudxlab.com\/wp-content\/uploads\/2020\/07\/Screenshot-2020-07-31-at-8.47.20-AM.png\" alt=\"\" class=\"wp-image-3158\" \/><figcaption>Larger models have better zero-shot accuracy and even better few-shot accuracy; bigger models are better at in-context learning.<\/figcaption><\/figure>\n\n\n\n<p>GPT-3 has 175 billion parameters. In comparison, GPT-2 had 1.5 billion, and the Google and Facebook equivalents have around 10 billion. The human brain has around 86 billion neurons. GPT-3 is quite big, and the in-context learning curves are much steeper for larger models. <\/p>\n\n\n\n<h3>Size Does Matter<\/h3>\n\n\n\n<p>A model this large means that small research teams or companies will not be able to build it for themselves. This <a href=\"https:\/\/venturebeat.com\/2020\/06\/01\/ai-machine-learning-openai-gpt-3-size-isnt-everything\/\">article<\/a> estimates that the training process would have required 350 GB of memory and cost $12.6 million. 
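The 350 GB figure is easy to sanity-check with back-of-the-envelope arithmetic (my own estimate, assuming the weights are stored as 16-bit floats; optimizer state during training multiplies this requirement several times over):

```python
# Rough memory footprint of GPT-3's weights alone (not optimizer state).
params = 175e9          # 175 billion parameters
bytes_per_param = 2     # half-precision (16-bit) floats
weight_gb = params * bytes_per_param / 1e9
print(f"{weight_gb:.0f} GB")  # -> 350 GB
```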
Very few companies or researchers have access to such resources, which also makes it difficult for them to compete. We don&#8217;t know what GPT-4, or the response from tech heavyweights like Google and Facebook, will look like. Maybe this will trigger a size race.<\/p>\n\n\n\n<h2>GPT-3 Release<\/h2>\n\n\n\n<p>OpenAI has not open-sourced this model. It is currently in private beta, and if you are interested you can <a href=\"https:\/\/forms.office.com\/Pages\/ResponsePage.aspx?id=VsqMpNrmTkioFJyEllK8s0v5E5gdyQhOuZCXNuMR8i1UQjFWVTVUVEpGNkg3U1FNRDVVRFg3U0w4Vi4u\">sign<\/a> up for it. OpenAI is clearly controlling the release process. The cherry-picked snippets of its demos have created the buzz they intended to, and the controlled release also lets OpenAI track negative feedback and unsavoury usage. A model this powerful is a great propaganda tool. Its ability to create content that is almost human-like will be of interest to organisations aiming to influence public opinion. Twitter bots will seem almost human.<\/p>\n\n\n\n<p>We still don&#8217;t know how this API will be priced when it is released. It is unlikely to be very cheap, considering the amount of compute a 175-billion-parameter model will require. OpenAI is probably also looking at ways to monetise the research it is doing. For now we can only speculate. <\/p>\n\n\n\n<p>GPT-3 is good beyond doubt, but there are several downsides and scope for improvement, which the authors of the paper have acknowledged. GPT-3 has been trained on almost the entire internet, so it has also picked up the biases and prejudices that plague humanity. Like all technologies, it can be used for good and for bad. I am going to trust our better angels that it will be more good than bad.<\/p>\n","protected":false},"excerpt":{"rendered":"<p> GPT-3 is the largest NLP model till date. It has 175 billion parameters and has been trained with 45TB of data. 
The applications of this model are immense.<\/p>\n","protected":false},"author":26,"featured_media":3167,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":[],"categories":[67,29,30],"tags":[],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v16.2 - https:\/\/yoast.com\/wordpress\/plugins\/seo\/ -->\n<title>What is GPT3 and will it take over the World | CloudxLab Blog<\/title>\n<meta name=\"description\" content=\"GPT3 is out and it has created a buzz. This article will examine the model and try to separate the fact from hype. GPT3 is the largest NLP model till date.\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/cloudxlab.com\/blog\/what-is-gpt3-and-will-it-take-over-the-world\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"What is GPT3 and will it take over the World | CloudxLab Blog\" \/>\n<meta property=\"og:description\" content=\"GPT3 is out and it has created a buzz. This article will examine the model and try to separate the fact from hype. 
GPT3 is the largest NLP model till date.\" \/>\n<meta property=\"og:url\" content=\"https:\/\/cloudxlab.com\/blog\/what-is-gpt3-and-will-it-take-over-the-world\/\" \/>\n<meta property=\"og:site_name\" content=\"CloudxLab Blog\" \/>\n<meta property=\"article:publisher\" content=\"https:\/\/www.facebook.com\/cloudxlab\" \/>\n<meta property=\"article:published_time\" content=\"2020-07-31T05:00:15+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2020-08-03T14:37:23+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/blog.cloudxlab.com\/wp-content\/uploads\/2020\/07\/pexels-pixabay-373543.jpg\" \/>\n\t<meta property=\"og:image:width\" content=\"2999\" \/>\n\t<meta property=\"og:image:height\" content=\"1999\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:creator\" content=\"@CloudxLab\" \/>\n<meta name=\"twitter:site\" content=\"@CloudxLab\" \/>\n<meta name=\"twitter:label1\" content=\"Est. reading time\">\n\t<meta name=\"twitter:data1\" content=\"5 minutes\">\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"WebSite\",\"@id\":\"https:\/\/cloudxlab.com\/blog\/#website\",\"url\":\"https:\/\/cloudxlab.com\/blog\/\",\"name\":\"CloudxLab Blog\",\"description\":\"Learn AI, Machine Learning, Deep Learning, Devops &amp; Big Data\",\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":\"https:\/\/cloudxlab.com\/blog\/?s={search_term_string}\",\"query-input\":\"required 
name=search_term_string\"}],\"inLanguage\":\"en-US\"},{\"@type\":\"ImageObject\",\"@id\":\"https:\/\/cloudxlab.com\/blog\/what-is-gpt3-and-will-it-take-over-the-world\/#primaryimage\",\"inLanguage\":\"en-US\",\"url\":\"https:\/\/cloudxlab.com\/blog\/wp-content\/uploads\/2020\/07\/pexels-pixabay-373543.jpg\",\"contentUrl\":\"https:\/\/cloudxlab.com\/blog\/wp-content\/uploads\/2020\/07\/pexels-pixabay-373543.jpg\",\"width\":2999,\"height\":1999},{\"@type\":\"WebPage\",\"@id\":\"https:\/\/cloudxlab.com\/blog\/what-is-gpt3-and-will-it-take-over-the-world\/#webpage\",\"url\":\"https:\/\/cloudxlab.com\/blog\/what-is-gpt3-and-will-it-take-over-the-world\/\",\"name\":\"What is GPT3 and will it take over the World | CloudxLab Blog\",\"isPartOf\":{\"@id\":\"https:\/\/cloudxlab.com\/blog\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\/\/cloudxlab.com\/blog\/what-is-gpt3-and-will-it-take-over-the-world\/#primaryimage\"},\"datePublished\":\"2020-07-31T05:00:15+00:00\",\"dateModified\":\"2020-08-03T14:37:23+00:00\",\"author\":{\"@id\":\"https:\/\/cloudxlab.com\/blog\/#\/schema\/person\/e2c5cc7b933ebd4b15f9b463dc7cf1b4\"},\"description\":\"GPT3 is out and it has created a buzz. This article will examine the model and try to separate the fact from hype. 
GPT3 is the largest NLP model till date.\",\"breadcrumb\":{\"@id\":\"https:\/\/cloudxlab.com\/blog\/what-is-gpt3-and-will-it-take-over-the-world\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/cloudxlab.com\/blog\/what-is-gpt3-and-will-it-take-over-the-world\/\"]}]},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/cloudxlab.com\/blog\/what-is-gpt3-and-will-it-take-over-the-world\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"item\":{\"@type\":\"WebPage\",\"@id\":\"https:\/\/cloudxlab.com\/blog\/\",\"url\":\"https:\/\/cloudxlab.com\/blog\/\",\"name\":\"Home\"}},{\"@type\":\"ListItem\",\"position\":2,\"item\":{\"@id\":\"https:\/\/cloudxlab.com\/blog\/what-is-gpt3-and-will-it-take-over-the-world\/#webpage\"}}]},{\"@type\":\"Person\",\"@id\":\"https:\/\/cloudxlab.com\/blog\/#\/schema\/person\/e2c5cc7b933ebd4b15f9b463dc7cf1b4\",\"name\":\"Praveen Pavithran\",\"image\":{\"@type\":\"ImageObject\",\"@id\":\"https:\/\/cloudxlab.com\/blog\/#personlogo\",\"inLanguage\":\"en-US\",\"url\":\"https:\/\/secure.gravatar.com\/avatar\/03c8d253347dcf9e04ec550cd6144973?s=96&d=mm&r=g\",\"contentUrl\":\"https:\/\/secure.gravatar.com\/avatar\/03c8d253347dcf9e04ec550cd6144973?s=96&d=mm&r=g\",\"caption\":\"Praveen Pavithran\"}}]}<\/script>\n<!-- \/ Yoast SEO plugin. 
-->","_links":{"self":[{"href":"https:\/\/cloudxlab.com\/blog\/wp-json\/wp\/v2\/posts\/3151"}],"collection":[{"href":"https:\/\/cloudxlab.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/cloudxlab.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/cloudxlab.com\/blog\/wp-json\/wp\/v2\/users\/26"}],"replies":[{"embeddable":true,"href":"https:\/\/cloudxlab.com\/blog\/wp-json\/wp\/v2\/comments?post=3151"}],"version-history":[{"count":8,"href":"https:\/\/cloudxlab.com\/blog\/wp-json\/wp\/v2\/posts\/3151\/revisions"}],"predecessor-version":[{"id":3166,"href":"https:\/\/cloudxlab.com\/blog\/wp-json\/wp\/v2\/posts\/3151\/revisions\/3166"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/cloudxlab.com\/blog\/wp-json\/wp\/v2\/media\/3167"}],"wp:attachment":[{"href":"https:\/\/cloudxlab.com\/blog\/wp-json\/wp\/v2\/media?parent=3151"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/cloudxlab.com\/blog\/wp-json\/wp\/v2\/categories?post=3151"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/cloudxlab.com\/blog\/wp-json\/wp\/v2\/tags?post=3151"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}