Revisions
raihan71 revised this gist
Nov 26, 2024 · 1 changed file with 3 additions and 3 deletions.

@@ -96,9 +96,9 @@

LLMs can take more than millions of parameters; that's why they can receive instructions in such complex language.

How it works: the LLM takes the input and breaks sentences down to understand the structure and meaning of each component (semantic parsing). Named Entity Recognition (NER): identifying and classifying key entities (e.g., people, places, dates) within text. Sentiment Analysis: determining the sentiment or emotion expressed in text.

And there are two kinds of LLM: base LLMs and instruction-tuned LLMs. Fun fact: in ELIZA, the first chatbot implementation, the model base was just a set of pre-defined rules, essentially a pattern-matching system that allows the program to identify keywords within a user's input and generate responses by substituting those keywords with pre-written phrases. It's kind of interesting that this approach generated a hallucination of understanding. It's also kind of interesting how everything gets labeled as AI these days, haha, even when the program is just if-else, haha... but that's actually true: AI really does contain if-else conditions, and obviously it's not just that; modern AI has a bunch of complex functions, neural networks, and so on. Nevertheless, as humans, we give AI a special place in our hearts: we're still amazed at how it evolves into something more human-like when we interact with it, and we fall in love with that. That feeling is called the ELIZA effect: when people think a computer or robot really understands them and has feelings, even though it's just following a program.
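To make the "just a pattern-matching system" point concrete, here is a minimal illustrative sketch of an ELIZA-style responder. This is my own example, not part of the original notes, and the rules and phrasings are invented rather than taken from ELIZA's actual script:

```typescript
// ELIZA-style responder: keyword rules plus template substitution.
// The rules below are invented for illustration only.
type Rule = { pattern: RegExp; respond: (m: RegExpMatchArray) => string };

const rules: Rule[] = [
  { pattern: /i feel (.*)/i, respond: (m) => `Why do you feel ${m[1]}?` },
  { pattern: /i had a (.*) dream/i, respond: (m) => `What does that ${m[1]} dream suggest to you?` },
  { pattern: /my (mother|father|family)/i, respond: (m) => `Tell me more about your ${m[1]}.` },
];

function reply(input: string): string {
  for (const rule of rules) {
    const match = input.match(rule.pattern);
    if (match) return rule.respond(match); // substitute the keyword into a canned phrase
  }
  return "Please go on."; // fallback when no keyword matches
}

console.log(reply("I feel sad")); // "Why do you feel sad?"
```

No neural network, no understanding, just rules; and yet it was enough to trigger the ELIZA effect in its users.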
raihan71 revised this gist
Nov 21, 2024 · 1 changed file with 11 additions and 8 deletions.

@@ -51,7 +51,7 @@

As we entered the 2000s, computers became even smaller and more portable. Over time, with advances in processing power and the introduction of the internet, computers became capable of handling enormous amounts of data. Fast forward to today, and we're in an era where AI, especially with large language models, can interpret and generate human-like responses, opening up possibilities we couldn't have imagined before. If we look at the pattern, it's connecting all the dots somehow; it's all connected, each step building on the one before.

@@ -89,13 +89,13 @@

In essence, NLP is what enables generative AI models to understand prompts and produce meaningful responses.

## Second chapter: LLMs

LLMs can take more than millions of parameters; that's why they can receive instructions in such complex language.

#### Natural Language Processing (NLP): Semantic Parsing

How it works: the LLM takes the input and breaks sentences down to understand the structure and meaning of each component. Named Entity Recognition (NER): identifying and classifying key entities (e.g., people, places, dates) within text. Sentiment Analysis: determining the sentiment or emotion expressed in text.

And there are two kinds of LLM: base LLMs and instruction-tuned LLMs. Fun fact: in ELIZA, the first chatbot implementation, the model base was just a set of pre-defined rules, essentially a pattern-matching system that allows the program to identify keywords within a user's input and generate responses by substituting those keywords with pre-written phrases.

It's kind of interesting how everything feels like AI these days, haha. I've seen so many memes about how AI works, if-else conditions, nested if-else, haha... but that's actually true: AI really does contain if-else conditions, and obviously it's not just that; modern AI has a bunch of complex functions, neural networks, and so on.

@@ -107,15 +107,18 @@

So, to run those models, what you need first is a capable device, at minimum something like 8 GB of RAM.

## Third Chapter: Prompting

OK, so here's the part we've been waiting for. Let the party begin: we're going to demystify how to leverage prompting for our coding productivity. One important question to agree on first: "What do you think makes a good developer?" Take a moment to think about it. Is it technical skills? Or maybe a knack for innovation? Problem-solving ability? That's right, one thing that makes a good developer is excellent problem solving. And how do developers usually solve problems when we hit an error? Googling, that's right, searching... haha.

Prompt engineering is actually quite similar. Instead of Googling, we're "prompting" a model, crafting specific inputs to guide the AI toward the response we're looking for. So I encourage you to think of prompt engineering as an extension of a skill you're already familiar with: being good at searching. It's about refining how we ask questions and strategically guiding the AI, in much the same way you would refine a search. With the right prompts, we can unlock the potential of AI to streamline our work and amplify our capabilities as developers. So far so good?

Do you guys agree with that? Say "agreed".

### Prompting Technique

Just like anything else in software development, we have frameworks: think Scrum for agile workflows, or MVC for structuring applications. Prompt engineering also has its own set of frameworks designed to make prompting more effective and consistent; they are more like techniques. For me, prompting is like learning a language, especially English: if you look at the LLMs and the available resources right now, English is the only option for complete, complex responses compared with other languages, probably due to limited training data. Hopefully in the near future we'll be able to prompt in other languages and get responses as good as in English.

Alright, the first technique we could use is:
raihan71 revised this gist
Nov 21, 2024 · 1 changed file with 1 addition and 0 deletions.

@@ -161,6 +161,7 @@

Web designers will be equipped with AI tools to simplify the design process, accurately predict user behavior, and personalize the user experience. These advancements will enable designers to focus on creativity and innovation, further enhancing the digital landscape.

In summary (see slide): I guess in the near future developers will care less about comparing tech stacks and technical debates over which framework is best for you, React, Angular, and so on. What they'll really care about is how fast you can generate code that is still around 99% accurate. And prompt engineering itself will keep evolving, or maybe we won't need these techniques at all in the future because the models will be even smarter: we'll give just a little input and the model will already know what we mean. So yeah, let's see...

So I think that's all, end of talk. Thank you for joining my session; once again, see you around, peace, enjoy the conference.
raihan71 revised this gist
Nov 20, 2024 · 1 changed file with 17 additions and 27 deletions.

@@ -10,7 +10,7 @@

Thank you to the organizers for the opportunity to be back at the conference, and of course to...

So yeah, let's dive into prompt engineering for web developers. As you know, my name is Raihan Nismara; you can call me Raihan. I'm from Indonesia, currently assigned to the front-end web team under Telkom Group Indonesia. You can always connect with me, feel free to follow, ask me anything. I also run a podcast for fun; give it a listen if you like.

@@ -22,16 +22,14 @@

So, here is the list of main topics we're going to cover in the 40 minutes ahead.

Alright, the first point: what is prompt engineering? We're also going to delve into AI history and other aspects of AI itself. Over the past few years, we've seen software development rapidly evolve, with AI transforming the way we write and generate code. Traditionally, developers worked line by line, defining every detail explicitly. But today, thanks to advancements in artificial intelligence and natural language processing, we're entering an era where we can guide our applications with high-level instructions, or prompts, a.k.a. human language. This approach is called Prompt Engineering (see slide), and it's actually reshaping how we think about development.

## First chapter: ai introduction

Before we dive deeper, we might wonder what is going on with software development right now. I mean, when did all of this happen? AI and prompts just suddenly appeared out of nowhere. When did they start impacting our work in such a big way? Let's take a thousand steps back for that. The answer is... it all started in ancient Greece, where philosophers debated ideas and sought wisdom... let's explain it one by one. But, oh, actually, that's a bit too far back and we don't have time. So let's fast forward a few thousand years and start with... when the first computer was invented.

### History Computers

@@ -49,13 +47,13 @@

By the 1970s, the integrated circuit had enabled the creation of the microprocessor, a single chip that could function as a computer's CPU.

- The Internet Age: 1990s
- The Mobile and Cloud Computing Revolution: 2000s-2010s

As we entered the 2000s, computers became even smaller and more portable. Over time, with advances in processing power and the introduction of the internet, computers became capable of handling enormous amounts of data. Fast forward to today, and we're in an era where AI, especially with large language models, can interpret and generate human-like responses, opening up possibilities we couldn't have imagined. If we look at the pattern, it's connecting all the dots somehow; it's all connected, each step building on the one before.

### History AI

@@ -64,24 +62,16 @@

Same as with computers, AI has evolved over the years.

- The foundations of AI were laid in the 1940s-1950s, when the first artificial neurons were conceptualised and introduced by Warren McCulloch and Walter Pitts. In the 1950s, Alan Turing introduced the world to the Turing Test, a framework to discern intelligent machines, setting the wheels in motion for the first generation of giant computers.
- Six years later, in 1956, a group of visionaries convened at the Dartmouth Conference hosted by John McCarthy, where the term "Artificial Intelligence" was first coined, setting the stage for decades of innovation.
- Early development: in the late 60s and 70s, the first NLP was integrated into a computer application, or we could say the first chatbot implementation. It was built at MIT by Joseph Weizenbaum and worked by recognizing keywords in a user's statement and then reflecting them back in the form of simple phrases or questions. For instance, if a user typed "I feel sad," ELIZA might respond with "Why do you feel sad?" This approach created the illusion of understanding.
- The 1980s were a period of both strife and regeneration for the AI community. The decade kicked off with reduced funding, marking that age as the 'AI Winter.'
- 1990s: Revival and Emergence of Machine Learning. Earlier, in 1996, the LOOM project came into existence, exploring the realms of knowledge representation and laying down the pathways for the meteoric rise of generative AI in the ensuing years.
- 2000s: The Genesis of Generative AI, as we rolled into the new millennium.
- 2010s: Rise of AI and Breakthroughs. The AI landscape has evolved, becoming more complex and, in many ways, more specialized. It isn't just one big field these days; it's grown into a vast ecosystem with many specialized branches, or subsets, from computer vision to natural language processing (NLP), robotics, and, of course, large language models.

But we're only going to focus on two of them here (see slide). And behind the prompt there's a model that a big company trained, like ChatGPT; if you take a look at how it's built, it was processed through many of these subsets, and that's what goes on in how a model is trained.

@@ -92,11 +82,11 @@

So, generative AI is one of the algorithm families in machine learning, or you could say a subset, and the technique of deep learning comes in two flavours: 1. discriminative, 2. generative

- discriminative: here's how it works, it classifies first and then decides, say, cat or dog
- generative: the way it works, it generates new things, for example an image of a cat

That's why, by definition, generative AI is about generating something new, or making decisions from what exists, and it was designed to understand humans naturally.

And if you're wondering where NLP fits in an LLM: as you can see on the slide, NLP is a branch of AI focused on enabling computers to understand, interpret, and produce human language. In essence, NLP is what enables generative AI models to understand prompts and produce meaningful responses. It's like the "brain" behind the language skills of an LLM, allowing it to parse and generate text in a way that feels natural and relevant to us.

## Second chapter: LLMs

LLMs can take more than millions of parameters; that's why they can receive instructions in such complex language. With such a vast number of parameters, running these models requires a tremendous amount of processing power. That's where specialized hardware, like GPUs, comes in. Unlike regular CPUs, GPUs are built to handle the parallel processing needed for deep learning tasks. When LLMs process text, they rely on GPUs (or even TPUs) to manage the heavy computational load involved in understanding and generating complex language patterns.

@@ -106,7 +96,7 @@

How it works: the LLM takes the input and breaks sentences down to understand the structure and meaning of each component. Named Entity Recognition (NER): identifying and classifying key entities (e.g., people, places, dates) within text. Sentiment Analysis: determining the sentiment or emotion expressed in text.

And there are two kinds of LLM: base LLMs and instruction-tuned LLMs. Fun fact: in ELIZA, the first chatbot implementation, the model base was just a set of pre-defined rules, essentially a pattern-matching system that allows the program to identify keywords within a user's input and generate responses by substituting those keywords with pre-written phrases.

It's kind of interesting how everything feels like AI these days, haha. I've seen so many memes about how AI works, if-else conditions, nested if-else, haha... but that's actually true: AI really does contain if-else conditions, and obviously it's not just that; modern AI has a bunch of complex functions, neural networks, and so on. Nevertheless, as humans, we give AI a special place in our hearts: we're still amazed at how it evolves into something more human-like when we interact with it, and we fall in love with that.
raihan71 revised this gist
Nov 20, 2024 · 1 changed file with 6 additions and 0 deletions.

@@ -168,4 +168,10 @@

The future of web development is intrinsically linked with Artificial Intelligence. Web designers will be equipped with AI tools to simplify the design process, accurately predict user behavior, and personalize the user experience. These advancements will enable designers to focus on creativity and innovation, further enhancing the digital landscape.

### summary

In summary (see slide): I guess in the near future developers will care less about comparing tech stacks and technical debates over which framework is best for you, React, Angular, and so on. What they'll really care about is how fast you can generate code that is still around 99% accurate.

So I think that's all, end of talk. Thank you for joining my session; once again, see you around, peace, enjoy the conference.
raihan71 revised this gist
Nov 19, 2024 · 1 changed file with 3 additions and 0 deletions.

@@ -166,3 +166,6 @@

So, we've explored how coding assistants like Copilot, ChatGPT, and others are transforming the way we write code, and we've touched on how mastering prompt techniques can amplify their effectiveness. But let's take a moment to think about where all of this is heading. The future of web development is intrinsically linked with Artificial Intelligence. As AI becomes more sophisticated, we can anticipate more websites incorporating AI-driven functionality into their web design. Web designers will be equipped with AI tools to simplify the design process, accurately predict user behavior, and personalize the user experience. These advancements will enable designers to focus on creativity and innovation, further enhancing the digital landscape.

### summary

So, I guess in the near future developers will care less about comparing tech stacks and technical debates over which framework is best for you, React, Angular, and so on. What they'll really care about is how fast you can generate code that is still around 99% accurate.
raihan71 revised this gist
Nov 19, 2024 · 1 changed file with 2 additions and 1 deletion.

@@ -164,4 +164,5 @@

## Almost Last Chapter: Future Development

So, we've explored how coding assistants like Copilot, ChatGPT, and others are transforming the way we write code, and we've touched on how mastering prompt techniques can amplify their effectiveness. But let's take a moment to think about where all of this is heading. The future of web development is intrinsically linked with Artificial Intelligence. As AI becomes more sophisticated, we can anticipate more websites incorporating AI-driven functionality into their web design. Web designers will be equipped with AI tools to simplify the design process, accurately predict user behavior, and personalize the user experience. These advancements will enable designers to focus on creativity and innovation, further enhancing the digital landscape.
raihan71 revised this gist
Nov 18, 2024 · 1 changed file with 2 additions and 0 deletions.

@@ -163,3 +163,5 @@

- sqlai

## Almost Last Chapter: Future Development

So, we've explored how coding assistants like Copilot, ChatGPT, and others are transforming the way we write code, and we've touched on how mastering prompt techniques can amplify their effectiveness. But let's take a moment to think about where all of this is heading. In the future, web development will be even more closely linked to AI; even now, it already is.
raihan71 revised this gist
Nov 17, 2024 · 1 changed file with 9 additions and 1 deletion.

@@ -154,4 +154,12 @@

Let's say we're tired of chatbots and just want something more hands-on to support our coding, like a coding assistant. There are a lot of coding assistants out there, so let's go through them one by one. What do you think the first one will be?

- Yes, GitHub Copilot, developed by GitHub and OpenAI, initial release 2021.
- Tabnine. Fun fact: Tabnine was actually the first pioneer in generative AI for software development; we could say it was the first coding assistant, launched in 2018.
- Replit
- Cursor.sh
- Amazon CodeWhisperer
- Snyk.io DeepCode AI
- MarsCode
- Cody
- SQLAI

## Almost Last Chapter: Future Development
raihan71 revised this gist
Nov 17, 2024 · 1 changed file with 13 additions and 2 deletions.

@@ -124,6 +124,7 @@

Googling, that's right, searching... haha.

Prompt engineering is actually quite similar. Instead of Googling, we're "prompting" a model, crafting specific inputs to guide the AI toward the response we're looking for. Just like with Google, the way we phrase our prompts can dramatically impact the quality of the output we get back. So, I encourage you to think of prompt engineering as an extension of a skill you're already familiar with: being good at searching. It's about refining how we ask questions and strategically guiding the AI, in much the same way you would refine a search. With the right prompts, we can unlock the potential of AI to streamline our work and amplify our capabilities as developers. So far so good?

Do you guys agree with that? Say "agreed".

### Prompting Technique

OK, so here's the part we've been waiting for. Just like anything else in software development, we have frameworks: think Scrum for agile workflows, or MVC for structuring applications. Prompt engineering also has its own set of frameworks designed to make prompting more effective and consistent; they are more like techniques. For me, prompting is like learning a language, especially English: if you look at the LLMs and the available resources right now, English is the only option for complete, complex responses compared with other languages, probably due to limited training data. Hopefully in the near future we'll be able to prompt in other languages and get responses as good as in English.

@@ -142,5 +143,15 @@

- 4th grader: this one is my favourite, a method where you ask the model to explain complex stuff with logic simple enough that a 4th grader would also understand it.

Other than that, we can also ask for a different output format. When you want your LLM to answer not just in plain text but in a cleaner, well-structured form, you can request a specific format (a sketch follows at the end of this revision's notes):

- tabular format: structuring the response as a table
- summarize with bullets: instructing the model to condense a large piece of text into a shorter, more concise version; the goal is to capture the key points or main ideas without losing the essential meaning.

Let's say we still don't know where to start with these techniques and we just want to explore prompts that someone else has already written, i.e. we just want some templates to modify right away. The answer is that we can leverage prompt templates:

- prompt templates: here are lists of template collections you can explore yourself, across many categories: tech, content creation, marketing, and so on.

### Coding Assistant

Let's say we're tired of chatbots and just want something more hands-on to support our coding, like a coding assistant. There are a lot of coding assistants out there, so let's go through them one by one. What do you think the first one will be?

- Yes, GitHub Copilot, developed by GitHub and OpenAI, initial release 2021.
- Tabnine. Fun fact: Tabnine was actually the first pioneer in generative AI for software development; we could say it was the first coding assistant, launched in 2018.
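Going back to the output-format idea above, here is a rough illustration of what such a request might look like. This sketch is mine, not from the original notes, and the framework comparison is only an assumed example topic:

```typescript
// Sketch: asking for a comparison in tabular format plus a bullet summary.
// The topic and the columns are assumptions; adjust them to your own question.
const formatPrompt = `
Compare React and Angular for building a medium-sized dashboard app.
First, answer as a markdown table with the columns: Criteria, React, Angular.
Then summarize with bullets: at most 5 bullet points covering the key takeaways.
`.trim();

console.log(formatPrompt); // send this to whichever chatbot or local model you use
```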
raihan71 revised this gist
Nov 17, 2024 · 1 changed file with 9 additions and 5 deletions.

@@ -135,8 +135,12 @@

- few-shot: the model is given a few examples of the task or context; quite similar to in-context learning, but this is the more common term.
- chain of thought: a technique that encourages the model to generate more detailed and reasoned responses by articulating its reasoning process step by step (a combined sketch follows at the end of this revision's notes).
- react: this technique focuses on how LLMs "reason" through the input they receive (the prompt), and then "act" by producing text or performing a specific task based on that reasoning.
- iteration: iteratively refining the generated responses or actions based on feedback and further instructions.
- comparisons: asking the model to compare two or more entities or concepts against specific criteria; useful in situations that require decision-making or analysis of multiple options.
- critique me: asking the model for immediate feedback on your input or on the context you provided.
- laddering: breaking the question down into pieces, one by one, making it easy to get what you want one step at a time.
- 4th grader: this one is my favourite, a method where you ask the model to explain complex stuff with logic simple enough that a 4th grader would also understand it.

Other than that, we can also ask for a different output format. When you want your LLM to answer not just in plain text but in a cleaner, well-structured form, you can request a specific format:

- tabular format:
- summarize with bullets:
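As a rough sketch of how a few of these techniques combine in practice (this example is mine, not from the original notes; the error-triage task, persona, and wording are assumptions), a few-shot prompt with a chain-of-thought instruction might be assembled like this:

```typescript
// Sketch: assembling a persona + few-shot + chain-of-thought prompt as a string.
// The task and the examples are invented for illustration.
const examples = [
  { input: "TypeError: x is undefined", label: "runtime error" },
  { input: "Unexpected token '<' in JSON at position 0", label: "parsing error" },
];

const fewShot = examples
  .map((e) => `Error: ${e.input}\nCategory: ${e.label}`)
  .join("\n\n");

const prompt = [
  "You are a senior web developer triaging front-end error logs.", // persona
  fewShot,                                                          // few-shot examples
  "Error: Cannot read properties of null (reading 'map')",          // the new case
  "Think step by step, then give the category on the last line.",   // chain of thought
].join("\n\n");

console.log(prompt); // send this string to whichever model you use
```

The same skeleton works for zero-shot (drop the examples) or iteration (feed the model's answer back in with a follow-up instruction).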
raihan71 revised this gist
Nov 14, 2024 · 1 changed file with 11 additions and 1 deletion.

@@ -129,4 +129,14 @@

Alright, the first technique we could use is:

- in-context learning: a technique where the model learns and generalizes from the examples provided in the input (the context).
- no-context learning: the opposite of in-context learning, and similar to zero-shot prompting; the model is given only the task, without any specific example or context. This is the technique most people use, I guess.
- persona: the model is given a specific persona or set of characteristics, behaviours, or roles. For example, "role as...", "as if...".
- zero-shot: the same thing as no-context learning.
- few-shot: the model is given a few examples of the task or context; quite similar to in-context learning, but this is the more common term.
- chain of thought: a technique that encourages the model to generate more detailed and reasoned responses by articulating its reasoning process step by step.
- react: this technique focuses on how LLMs "reason" through the input they receive (the prompt), and then "act" by producing text or performing a specific task based on that reasoning.
- iteration:
- comparisons:
- critique me:
- laddering:
- 4th grader: this one is my favourite
raihan71 revised this gist
Nov 14, 2024 · 1 changed file with 2 additions and 1 deletion.

@@ -128,4 +128,5 @@

For me, prompting is like learning a language, especially English: if you look at the LLMs and the available resources right now, English is the only option for complete, complex responses compared with other languages, probably due to limited training data. Hopefully in the near future we'll be able to prompt in other languages and get responses as good as in English.

Alright, the first technique we could use is:

- in-context learning: a technique where the model learns and generalizes from the examples provided in the input (the context).
- no-context learning: unlike in-context learning, no-context is the opposite of that; it is similar to zero-shot prompting.
raihan71 revised this gist
Nov 14, 2024 · 1 changed file with 5 additions and 1 deletion.

@@ -124,4 +124,8 @@

Googling, that's right, searching... haha.

Prompt engineering is actually quite similar. Instead of Googling, we're "prompting" a model, crafting specific inputs to guide the AI toward the response we're looking for. Just like with Google, the way we phrase our prompts can dramatically impact the quality of the output we get back. So, I encourage you to think of prompt engineering as an extension of a skill you're already familiar with: being good at searching. It's about refining how we ask questions and strategically guiding the AI, in much the same way you would refine a search. With the right prompts, we can unlock the potential of AI to streamline our work and amplify our capabilities as developers. So far so good? Do you guys agree with that? Say "agreed".

OK, so here's the part we've been waiting for. Just like anything else in software development, we have frameworks: think Scrum for agile workflows, or MVC for structuring applications. Prompt engineering also has its own set of frameworks designed to make prompting more effective and consistent; they are more like techniques. For me, prompting is like learning a language, especially English: if you look at the LLMs and the available resources right now, English is the only option for complete, complex responses compared with other languages, probably due to limited training data. Hopefully in the near future we'll be able to prompt in other languages and get responses as good as in English.

Alright, the first technique we could use is:

- in-context learning:
raihan71 revised this gist
Nov 14, 2024 · 1 changed file with 7 additions and 1 deletion.

@@ -118,4 +118,10 @@

## Third Chapter: Prompting

OK, let the party begin: we're going to demystify how to leverage prompting for our coding productivity. Let's go back to the definition once again, with a simpler mapping for each phrase. The prompt is the input we send to the LLM; it could be text, an image, and much more. And engineering is the practice of applying science and knowledge, or certain methods, to solve technical problems. So prompt engineering is basically the practice of crafting what we input, what we request from the LLM, within certain best practices and with some rules set inside.

One important question to agree on here: "What do you think makes a good developer?" Take a moment to think about it. Is it technical skills? Or maybe a knack for innovation? Problem-solving ability? That's right, one thing that makes a good developer is excellent problem solving. And how do developers usually solve problems when we hit an error?

Googling, that's right, searching... haha.

Prompt engineering is actually quite similar. Instead of Googling, we're "prompting" a model, crafting specific inputs to guide the AI toward the response we're looking for. Just like with Google, the way we phrase our prompts can dramatically impact the quality of the output we get back. So, I encourage you to think of prompt engineering as an extension of a skill you're already familiar with: being good at searching. It's about refining how we ask questions and strategically guiding the AI, in much the same way you would refine a search. With the right prompts, we can unlock the potential of AI to streamline our work and amplify our capabilities as developers. So far so good? Do you guys agree with that? Say "agreed".

OK, so here's the part we've been waiting for. Just like anything else in software development, we have frameworks: think Scrum for agile workflows, or MVC for structuring applications. Prompt engineering also has its own set of frameworks designed to make prompting more effective and consistent; they are more like techniques.
raihan71 revised this gist
Nov 14, 2024 · 1 changed file with 2 additions and 2 deletions.

@@ -117,5 +117,5 @@

## Third Chapter: Prompting

OK, let the party begin: we're going to demystify how to leverage prompting for our coding productivity. Let's go back to the definition once again, with a simpler mapping for each phrase. The prompt is the input we send to the LLM; it could be text, an image, and much more. And engineering is the practice of applying science and knowledge, or certain methods, to solve technical problems. So prompt engineering is basically the practice of crafting what we input, what we request from the LLM, within certain best practices and with some rules set inside.

One important question to agree on here: "What do you think makes a good developer?" Take a moment to think about it. Is it technical skills? Or maybe a knack for innovation?
raihan71 revised this gist
Nov 14, 2024 · 1 changed file with 4 additions and 0 deletions.

@@ -115,3 +115,7 @@

That feeling is called the ELIZA effect: when people think a computer or robot really understands them and has feelings, even though it's just following a program.

Just like any other program, LLMs can also be run locally on your own machine. For example, these are the big three models that are popular right now (see the slide). Other than that we also have (see the slide). So, to run those models, what you need first is a capable device, at minimum something like 8 GB of RAM (I think 8 GB is not enough, like mine), and then we can start with the Ollama desktop app. You can choose a model on Ollama by going to https://ollama.com/search (see the sketch below).

## Third Chapter: Prompting

OK, let the party begin: we're going to demystify how to leverage prompting for our coding productivity. Let's go back to the definition once again, with a simpler mapping for each phrase. The prompt is the input we send to the LLM; it could be text, an image, and much more. And engineering is the practice of applying science and knowledge, or certain methods, to solve technical problems. So prompt engineering is basically the practice of crafting what we input, what we request from the LLM, within certain best practices and with rules set inside.
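Once Ollama is running locally it exposes an HTTP endpoint (by default on port 11434). The sketch below is my own minimal example, not from the original notes; the model name is an assumption, so use whichever model you pulled from ollama.com/search:

```typescript
// Sketch: sending a prompt to a locally running Ollama server.
// Assumes Ollama is installed and a model (here "llama3.2") has already been pulled.
async function askLocalModel(prompt: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: "llama3.2", prompt, stream: false }),
  });
  const data = await res.json();
  return data.response; // with stream: false, Ollama returns the whole answer in `response`
}

askLocalModel("Explain closures in JavaScript to a 4th grader.")
  .then(console.log)
  .catch(console.error);
```

Running this way keeps everything on your machine, which can be handy when you don't want to send code to a hosted chatbot.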
raihan71 revised this gist
Nov 14, 2024 · 1 changed file with 1 addition and 1 deletion.

@@ -113,5 +113,5 @@

Nevertheless, as humans, we give AI a special place in our hearts: we're still amazed at how it evolves into something more human-like when we interact with it. That feeling is called the ELIZA effect: when people think a computer or robot really understands them and has feelings, even though it's just following a program.

Just like any other program, LLMs can also be run locally on your own machine. For example, these are the big three models that are popular right now (see the slide). Other than that we also have (see the slide). So, to run those models, what you need first is a capable device, at minimum something like 8 GB of RAM (I think 8 GB is not enough, like mine), and then we can start with the Ollama desktop app. You can choose a model on Ollama by going to https://ollama.com/search
raihan71 revised this gist
Nov 14, 2024 · 1 changed file with 3 additions and 1 deletion.

@@ -112,4 +112,6 @@

I've seen so many memes about how AI works, if-else conditions, nested if-else...

Nevertheless, as humans, we give AI a special place in our hearts: we're still amazed at how it evolves into something more human-like when we interact with it, and we fall in love with that. That feeling is called the ELIZA effect: when people think a computer or robot really understands them and has feelings, even though it's just following a program.

Just like any other program, LLMs can also be run locally on your own machine. For example, these are the big three models that are popular right now (see the slide). Other than that we also have (see the slide). So, to run those models, what you need first is a capable device, at minimum something like 8 GB of RAM (I think 8 GB is not enough, like mine), and then we can start with the Ollama desktop app.
raihan71 revised this gist
Nov 14, 2024 · 1 changed file with 1 addition and 1 deletion.

@@ -112,4 +112,4 @@

I've seen so many memes about how AI works, if-else conditions, nested if-else...

Nevertheless, as humans, we give AI a special place in our hearts: we're still amazed at how it evolves into something more human-like when we interact with it, and we fall in love with that. That feeling is called the ELIZA effect: when people think a computer or robot really understands them and has feelings, even though it's just following a program.

Just like any other program, LLMs can also be run locally on your own machine; for example, these are the big three models that are popular
raihan71 revised this gist
Nov 14, 2024 · No changes.
raihan71 revised this gist
Nov 13, 2024 · 1 changed file with 7 additions and 0 deletions.

@@ -105,4 +105,11 @@

#### Natural Language Processing (NLP): Semantic Parsing

How it works: the LLM takes the input and breaks sentences down to understand the structure and meaning of each component. Named Entity Recognition (NER): identifying and classifying key entities (e.g., people, places, dates) within text. Sentiment Analysis: determining the sentiment or emotion expressed in text.

And there are two kinds of LLM: base LLMs and instruction-tuned LLMs. Fun fact: in ELIZA, the first chatbot implementation, the model base was just a set of pre-defined rules, essentially a pattern-matching system that allows the program to identify keywords within a user's input and generate responses by substituting those keywords with pre-written phrases. For example, if a user input "I feel sad," ELIZA might respond with "Why do you feel sad?" This approach created the illusion of understanding.

It's kind of interesting how everything feels like AI these days, haha. I've seen so many memes about how AI works, if-else conditions, nested if-else, haha... but that's actually true: AI really does contain if-else conditions, and obviously it's not just that; modern AI has a bunch of complex functions, neural networks, and so on. Nevertheless, as humans, we give AI a special place in our hearts: we're still amazed at how it evolves into something more human-like when we interact with it, and we fall in love with that. That feeling is called the ELIZA effect: when people think a computer or robot really understands them and has feelings, even though it's just following a program.
raihan71 revised this gist
Nov 13, 2024 · 1 changed file with 7 additions and 1 deletion.

@@ -99,4 +99,10 @@

## Second chapter: LLMs

LLMs can take more than millions of parameters; that's why they can receive instructions in such complex language. With such a vast number of parameters, running these models requires a tremendous amount of processing power. That's where specialized hardware, like GPUs, comes in. Unlike regular CPUs, GPUs are built to handle the parallel processing needed for deep learning tasks. When LLMs process text, they rely on GPUs (or even TPUs) to manage the heavy computational load involved in understanding and generating complex language patterns. This demand for powerful hardware has created a ripple effect, especially as LLMs have grown in size and popularity. The need for GPUs has skyrocketed, and developers, researchers, and companies everywhere were suddenly competing for these high-spec machines, which led to a notable spike in GPU prices.

#### Natural Language Processing (NLP): Semantic Parsing

How it works: the LLM takes the input and breaks sentences down to understand the structure and meaning of each component. Named Entity Recognition (NER): identifying and classifying key entities (e.g., people, places, dates) within text. Sentiment Analysis: determining the sentiment or emotion expressed in text. (A small prompt sketch for these tasks follows below.)
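To make those NLP tasks concrete, here is a rough illustration of asking a model for entity extraction and sentiment analysis in one structured reply. This sketch is mine, not from the original notes, and the JSON field names and example sentence are assumptions:

```typescript
// Sketch: prompting a model to return named entities and sentiment as JSON.
// The schema and the example sentence are invented for illustration.
const text = "Raihan spoke about prompt engineering in Jakarta on Nov 26, 2024.";

const nlpPrompt = `
Extract the named entities (people, places, dates) from the text below,
and classify the overall sentiment as positive, neutral, or negative.
Reply with JSON only, shaped like:
{"entities": [{"text": "...", "type": "person | place | date"}], "sentiment": "..."}

Text: "${text}"
`.trim();

// Send `nlpPrompt` to any LLM endpoint, then parse the model's reply:
function parseReply(reply: string) {
  return JSON.parse(reply) as {
    entities: { text: string; type: string }[];
    sentiment: "positive" | "neutral" | "negative";
  };
}
```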
raihan71 revised this gist
Nov 12, 2024 · 1 changed file with 12 additions and 1 deletion.

@@ -85,7 +85,18 @@

The AI landscape has evolved, becoming more complex and, in many ways, more specialized.

And behind the prompt there's a model that a big company trained, like ChatGPT; if you take a look at how it's built, it was processed through many of these subsets, and that's what goes on in how a model is trained.

### NLP

I want to highlight deep learning here and how it relates to generative AI. So, generative AI is one of the algorithm families in machine learning, or you could say a subset, and it is made using the technique of deep learning, trained on a lot of data, possibly terabytes.

And the technique of deep learning comes in two flavours: 1. discriminative, 2. generative

- discriminative: here's how it works, it classifies first and then decides, say, cat or dog
- generative: the way it works, it generates new things, for example an image of a cat

That's why, by definition, generative AI is artificial intelligence that generates something new, or makes decisions from what exists, and it was designed to understand humans naturally, i.e. in human language, whether English, Mandarin, etc.

And if you're wondering where NLP fits in an LLM: as you can see on the slide, NLP is a branch of AI focused on enabling computers to understand, interpret, and produce human language. In essence, NLP is what enables generative AI models to understand prompts and produce meaningful responses. It's like the "brain" behind the language skills of an LLM, allowing it to parse and generate text in a way that feels natural and relevant to us.

## Second chapter: LLMs

LLMs can take more than millions of parameters; that's why they can receive instructions in such complex language.
raihan71 revised this gist
Nov 12, 2024 · 1 changed file with 3 additions and 1 deletion.

@@ -86,4 +86,6 @@

The AI landscape has evolved, becoming more complex and, in many ways, more specialized.

And behind the prompt there's a model that a big company trained, like ChatGPT; if you take a look at how it's built, it was processed through many of these subsets, and that's what goes on in how a model is trained.

I want to highlight deep learning here and how it relates to generative AI. So, generative AI is one of the algorithm families in machine learning, or you could say a subset, and it is made using the technique of deep learning, trained on a lot of data, possibly terabytes. And the technique of deep learning comes in two flavours: 1. discriminative, 2. generative
raihan71 revised this gist
Nov 12, 2024 · 1 changed file with 4 additions and 0 deletions.

@@ -83,3 +83,7 @@

The momentum continued in 2022, with the emergence of open-source solutions from collaborative endeavours of entities like Midjourney and Stability AI, amplifying the collaborative spirit in the AI community.

The AI landscape has evolved, becoming more complex and, in many ways, more specialized. It isn't just one big field these days; it's grown into a vast ecosystem with many specialized branches, or subsets, from computer vision to natural language processing (NLP), robotics, and, of course, large language models.

And behind the prompt there's a model that a big company trained, like ChatGPT; if you take a look at how it's built, it was processed through many of these subsets, and that's what goes on in how a model is trained.

I want to highlight deep learning here and how it relates to generative AI. So, deep learning: 1. discriminative, 2. generative
raihan71 revised this gist
Nov 12, 2024 · 1 changed file with 1 addition and 2 deletions.

@@ -81,6 +81,5 @@

The current decade is already brimming with groundbreaking developments, taking Generative AI to uncharted territories.

The momentum continued in 2022, with the emergence of open-source solutions from collaborative endeavours of entities like Midjourney and Stability AI, amplifying the collaborative spirit in the AI community.

The AI landscape has evolved, becoming more complex and, in many ways, more specialized. It isn't just one big field these days; it's grown into a vast ecosystem with many specialized branches, or subsets, from computer vision to natural language processing (NLP), robotics, and, of course, large language models.
raihan71 revised this gist
Nov 12, 2024 · 1 changed file with 15 additions and 0 deletions.

@@ -67,5 +67,20 @@

- Early development: in the late 60s and 70s, the first chatbot implementation was built at MIT by Joseph Weizenbaum. It worked by recognizing keywords in a user's statement and then reflecting them back in the form of simple phrases or questions. For instance, if the user types "I had a strange dream," the code picks up the key "dream," recognizes that the user claims to have had a dream, and may respond, "What does that dream suggest to you?"
- The 1980s were a period of both strife and regeneration for the AI community. The decade kicked off with reduced funding, marking the onset of the 'AI Winter.' However, the first National Conference on Artificial Intelligence in 1980 kept the flames of innovation burning, bringing together minds committed to the growth of AI. A significant rebound occurred in 1986 with the resurgence of neural networks, facilitated by the revolutionary concept of backpropagation, reviving hopes and laying a robust foundation for future developments in AI.
- 1990s: Revival and Emergence of Machine Learning. The 90s heralded a renaissance in AI, rejuvenated by a combination of novel techniques and unprecedented milestones. 1997 witnessed a monumental face-off where IBM's Deep Blue triumphed over world chess champion Garry Kasparov. This victory was not just a game win; it symbolised AI's growing analytical and strategic prowess, promising a future where machines could potentially outthink humans. Earlier, in 1996, the LOOM project came into existence, exploring the realms of knowledge representation and laying down the pathways for the meteoric rise of generative AI in the ensuing years.
- 2000s: The Genesis of Generative AI. As we rolled into the new millennium, the world stood at the cusp of a Generative AI revolution. The undercurrents began in 2004 with murmurs about Generative Adversarial Networks (GANs) starting to circulate in the scientific community, heralding a future of unprecedented creativity fostered by AI. The middle of the decade witnessed a transformative moment in 2006 as Geoffrey Hinton propelled deep learning into the limelight, steering AI toward relentless growth and innovation.
- 2010s: Rise of AI and Breakthroughs. In 2011, IBM Watson emerged victorious on "Jeopardy!", demonstrating the mammoth strides AI had taken in comprehending and processing natural language, setting the stage for more sophisticated developments in language understanding. As we ventured into the 2010s, the AI realm experienced a surge of advancements at a blistering pace. The beginning of the decade saw a convolutional neural network setting new benchmarks in the ImageNet competition in 2012, proving that AI could potentially rival human intelligence in image recognition tasks. In 2014, Ian Goodfellow and his team formalised the concept of Generative Adversarial Networks (GANs), creating a revolutionary tool that fostered creativity and innovation in the AI space. The latter half of the decade witnessed the birth of OpenAI in 2015, aiming to channel AI advancements for the benefit of all humanity. 2016 marked the introduction of WaveNet, a deep learning-based system capable of synthesising human-like speech, inching closer to replicating human functionalities through artificial means.
- 2020s: Generative AI Reaches New Horizons. The current decade is already brimming with groundbreaking developments, taking Generative AI to uncharted territories. In 2020, the launch of GPT-3 by OpenAI opened new avenues in human-machine interactions, fostering richer and more nuanced engagements. 2021 was a watershed year, boasting a series of developments such as OpenAI's DALL-E, which could conjure images from text descriptions, illustrating the awe-inspiring capabilities of multimodal AI. This year also saw the European Commission spearheading efforts to regulate AI, stressing ethical deployments amidst a whirlpool of advancements. The momentum continued in 2022, with the emergence of open-source solutions from collaborative endeavours of entities like Midjourney and Stability AI, amplifying the collaborative spirit in the AI community. In 2023, the AI landscape experienced a tectonic shift with the launch of ChatGPT-4 and Google's Bard, taking conversational AI to pinnacles never reached before. In parallel, Microsoft's Bing AI emerged, utilising generative AI technology to refine search experiences, promising a future where information is more accessible and reliable than ever before.
raihan71 revised this gist
Nov 12, 2024 · 1 changed file with 7 additions and 9 deletions.

@@ -35,22 +35,20 @@

Let's fast forward a few thousand years, and let's start with when the first computer was invented.

### History Computers

- Our story begins in the 1600s, when mathematicians and inventors sought to build machines to make calculations easier. Blaise Pascal, a French mathematician, created one of the first mechanical calculators, called the Pascaline, in 1642.
- Fast-forward to the early 19th century, when English mathematician Charles Babbage envisioned a more complex machine. In 1837, he designed the Analytical Engine, a fully mechanical computer capable of performing any calculation. Babbage's design laid the conceptual groundwork for modern computers.
- The Birth of the Transistor: the 1950s
- The IC (Integrated Circuit) Revolution: the 1960s-70s. By the 1970s, the integrated circuit had enabled the creation of the microprocessor, a single chip that could function as a computer's CPU.
- The 1980s marked the dawn of the personal computer (PC).
- The Internet Age: 1990s
- The Mobile and Cloud Computing Revolution: 2000s-2010s

As we entered the 2000s, computers became even smaller and more portable. The smartphone, powered by advanced microprocessors and touchscreen technology, combined computing power with cellular communication. Apple's iPhone, launched in 2007, redefined mobile computing, and soon smartphones became essential tools in daily life, offering apps for almost anything imaginable. Over time, with advances in processing power and the introduction of the internet, computers became capable of handling enormous amounts of data. This shift paved the way for artificial intelligence as we know it: systems that could not only perform calculations but also analyze data and recognize patterns. But for a long time, AI was limited to specific tasks, unable to understand or generate human language effectively.
raihan71 revised this gist
Nov 12, 2024 · 1 changed file with 8 additions and 2 deletions.

@@ -1,7 +1,7 @@

Hi everyone! It's great to be here today, I'm really excited! So, a few months ago I was diagnosed by the doctor with a lymph gland issue, something with the lymph nodes in my right leg. The doctor also said it was infected by the same bacteria as in tuberculosis, a.k.a. Tuberculous (tuberkeles) Lymphadenitis (limpadenaitis). In my mind I kept saying, I'm afraid I can't make it, I'm afraid I can't make it to the conference. After surgery, taking my medication, and recovery, here I am: not only standing here healed and feeling better, but also more passionate than ever to share my journey learning about prompt engineering.

@@ -63,5 +63,11 @@

If we look at the pattern, it's connecting all the dots somehow; it's all connected, each step relates to the next.

Artificial Intelligence, by definition, is technology that enables computers and machines to simulate human learning, comprehension, problem solving, decision making, creativity, and autonomy. Same as with computers, AI has evolved over the years.

- The foundations of AI were laid in the 1940s-1950s, when the first artificial neurons were conceptualised and introduced by Warren McCulloch and Walter Pitts. In the 1950s, Alan Turing introduced the world to the Turing Test, a framework to discern intelligent machines, setting the wheels in motion for the first generation of giant computers.
- Six years later, in 1956, a group of visionaries convened at the Dartmouth Conference hosted by John McCarthy, where the term "Artificial Intelligence" was first coined, setting the stage for decades of innovation.
- Early development: in the late 60s and 70s, the first NLP was integrated into a computer application, or we could say the first chatbot implementation. It was built at MIT by Joseph Weizenbaum and worked by recognizing keywords in a user's statement and then reflecting them back in the form of simple phrases or questions. For instance, if the user types "I had a strange dream," the code picks up the key "dream," recognizes that the user claims to have had a dream, and may respond, "What does that dream suggest to you?"
- The 1980s were a period of both strife and regeneration for the AI community. The decade kicked off with reduced funding, marking the onset of the 'AI Winter.' However, the first National Conference on Artificial Intelligence in 1980 kept the flames of innovation burning, bringing together minds committed to the growth of AI. A significant rebound occurred in 1986 with the resurgence of neural networks, facilitated by the revolutionary concept of backpropagation, reviving hopes and laying a robust foundation for future developments in AI.