How to Prompt in ChatGPT

Understanding Prompt Engineering Basics!

Hi all,
We are in an AI age where machines can seemingly think for themselves and take decisions in our real lives. In the future they may even nudge humans towards the same decisions they would take. I am not exaggerating; this is possible because of Generative Pre-trained Transformer (GPT) models. Anyone can ask them for whatever information they need, and the results depend on how we chat with the model. I won't go deeply into how a GPT model works, but it is essential for all of us to know how to talk to these models and how to use them effectively, rather than being swept along by them.

In November 2022 ChatGPT was introduced by OpenAI, and since then everybody has started using GPT engines; traditional search engine usage is slowly declining because GPT models give dynamic results tailored to each user. Microsoft introduced Copilot, Google introduced Bard (now Gemini), and so on. There is also a dark side to GPT: it can produce made-up, false, or biased content. So my simple advice is not to trust GPT models blindly; make sure you double- or triple-check.

In this article we will look at what prompt engineering is and how we can use GPT effectively in our daily lives.

In the software world there is a famous acronym used very often: GIGO (Garbage In, Garbage Out). Whatever we give as input, the output is based on that. This applies even more to AI, especially to generative pre-trained models. The free personal version is the most widely used one as of now. Behind GPT sits an LLM (Large Language Model) that works token by token on every word we use, so if you talk and interact with it clearly, the model will give you better and better results. Based on the prompt (the ask) we give, the model gathers the relevant information for every input word and returns the most appropriate result.

The main advantage of GPT is that we don't require specific knowledge of the topic. We can query it in natural language; English is the most preferred language, and while other languages can also be used in prompts, compared to English they are a step behind.

Let's go to the basics of prompt engineering. When we discuss prompt engineering, we should know these terms: zero-shot, one-shot, and few-shot.
Zero-Shot: the model is given a prompt without any examples.
One-Shot: a single example is provided.
Few-Shot: several examples are used to align the model's thinking towards the specified or desired result.

I would suggest going with at least one-shot, but if you expect precise results, you should provide more and more examples (a sketch of a few-shot prompt follows below). This is the first rule you should always keep in mind. Also, give a thumbs up or down to train the model even further through your feedback.
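To make zero-, one-, and few-shot concrete, here is a minimal sketch assuming the OpenAI Python SDK (v1+); the model name and the example sentiment labels are placeholders I chose for illustration, not something from this article. The few-shot examples are passed as earlier user/assistant turns so the model can copy their format.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Few-shot: show the model a couple of worked examples before the real question,
# so it mirrors their tone and format. Sending only the last user line would be zero-shot.
messages = [
    {"role": "system", "content": "You classify customer feedback as Positive, Negative, or Neutral."},
    # Example 1
    {"role": "user", "content": "The delivery was quick and the packaging was great."},
    {"role": "assistant", "content": "Positive"},
    # Example 2
    {"role": "user", "content": "The product stopped working after two days."},
    {"role": "assistant", "content": "Negative"},
    # The actual query
    {"role": "user", "content": "It arrived on time, nothing special otherwise."},
]

# "gpt-4o-mini" is a placeholder model name; use whichever model you have access to.
response = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
print(response.choices[0].message.content)
```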

Here are some tips so that you can use GPT models and get as much value from them as possible.

1. Rename the Chats – We can ask ChatGPT multiple things in a single day, so it is better to organise those conversations and reuse them to get better results. Renaming the chats in our history makes them easy to find and continue later.

2. Using a Persona – When we start asking the model something, it is better to give the model a persona.
For example, instead of “give me a tip to get fit”,
we can prompt “Acting as a fitness expert, give me a tip to get fit”.
We can also ask for more current knowledge, such as “Acting as a fitness expert who knows the latest research on fitness, provide a detailed solution”.
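When using the API rather than the chat window, the persona usually goes into the system message so that every later question is answered in that role. A minimal sketch, again assuming the OpenAI Python SDK and a placeholder model name:

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# The persona lives in the system message; the user message stays short.
response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[
        {"role": "system", "content": "Act as a fitness expert who knows the latest research "
                                      "on fitness and gives detailed, practical advice."},
        {"role": "user", "content": "Give me a tip to get fit."},
    ],
)
print(response.choices[0].message.content)
```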

3. Interview Pattern – Building on the persona above, we can ask the model to follow an interview pattern: it asks us one question at a time before it gives its advice (see the sketch below).
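One way to get this behaviour in code is to tell the model explicitly to ask one question at a time and to feed each of your answers back into the conversation. A rough sketch under the same SDK and model-name assumptions as above:

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

system_prompt = ("Act as a fitness expert. Interview me one question at a time about my "
                 "goals and lifestyle before giving any advice.")
messages = [
    {"role": "system", "content": system_prompt},
    {"role": "user", "content": "I want a personalised workout plan."},
]

# Three interview turns: the model asks a question, we answer, and the chat history grows.
for _ in range(3):
    reply = client.chat.completions.create(model="gpt-4o-mini", messages=messages)  # placeholder model
    question = reply.choices[0].message.content
    print("Model:", question)
    messages.append({"role": "assistant", "content": question})
    messages.append({"role": "user", "content": input("You: ")})
```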

4. Chain of Thought – For more complex queries we have to give conditions and instructions, as well as tips or hints on how to carry out the operation. When dealing with this type of query, try using “let's think step by step” or “Let's work this out in a step by step way to be sure we have the right answer”. These two phrases may help you get better output. One disclaimer: this approach is time-consuming.
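In code, chain-of-thought prompting is often just a matter of appending one of those phrases to the question. A minimal sketch (the arithmetic question is my own invented example, and the model name is still a placeholder):

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

question = ("A gym membership costs 40 per month plus a one-time fee of 25. "
            "How much do 7 months cost in total?")

# Appending the step-by-step phrase nudges the model to show its working.
response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[{
        "role": "user",
        "content": question + "\nLet's work this out in a step by step way "
                              "to be sure we have the right answer.",
    }],
)
print(response.choices[0].message.content)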

5. Tree of Thought – We know that Chain of Thought (CoT) works step by step towards a single solution, whereas Tree of Thought (ToT) expects multiple ways of solving a single query. Example: “Imagine three brilliant experts approaching this same query and provide me their solutions”. You will get three different answers or methods for one complex query. If you combine this with everything mentioned above, such as the persona, the interview pattern, and CoT, you can get even more precise results.
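A simple way to phrase that “three experts” framing in a single prompt, sketched with the same assumed SDK and a placeholder model and query:

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

query = "How should a small bakery reduce food waste without losing revenue?"

# Ask for three independent expert approaches, then a comparison and recommendation.
prompt = (
    "Imagine three brilliant experts independently approaching this problem. "
    "Each expert writes out their own step-by-step solution, then you compare "
    "the three and recommend the strongest one.\n\nProblem: " + query
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[{"role": "user", "content": prompt}],
)
print(response.choices[0].message.content)
```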

6. Verbosity – By giving custom instructions and guidelines with some level of scaling, you can ask for the answer at a low, medium, or high level of detail, and the output will be very different. An example is asking it to explain something to a school student, a college student, or a PhD researcher.
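To see the difference side by side, a small sketch that asks the same question at the three audience levels mentioned above (same SDK assumption, placeholder model and topic):

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

topic = "How does a neural network learn?"

# One answer per audience level, so the change in verbosity is easy to compare.
for audience in ["a school student", "a college student", "a PhD researcher"]:
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{"role": "user", "content": f"Explain to {audience}: {topic}"}],
    )
    print(f"--- {audience} ---")
    print(response.choices[0].message.content)
```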

To conclude, AI has made our lives much easier, but it will hurt us if it becomes a monopoly over our own thinking. So use LLM-based GPT models wisely for your prompts, and my firm advice is: don't forget to double- and triple-check. Also, don't just expect the answer from ChatGPT; give it more definitions, inputs, and examples, and train it. GPT gives suggestions, not the only option!

