Posts

Showing posts from July, 2025

Prompt Engineering for ChatGPT (4)

Few-Shot Prompting / Few-Shot Examples with Intermediate Steps: we can use this pattern for customer service.
Chain of Thought Prompting: add reasoning (your own thought process) when providing examples to ChatGPT.
ReAct (Reason + Act) Prompting: teaches the LLM how to think through a process, recognize the steps where it needs to go outside itself, and use another tool to take an action. The system learns to choose the right tools at the right time to solve problems effectively. It can be integrated with other tools through well-designed prompts, enabling more complex automation. One popular framework for building such systems is LangChain, which uses ReAct under the hood to support these advanced capabilities.
Using Large Language Models to Grade Each Other
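To make the ReAct idea concrete, here is a minimal Python sketch of the reason/act/observe loop. The model call is stubbed with canned replies so the example runs as written, and the calculator tool, the "Action: tool[input]" line format, and the "Final Answer:" stop phrase are assumptions made for this illustration rather than part of LangChain or any other framework.

# Minimal ReAct-style loop (sketch). llm() is a stand-in for a real model call;
# it returns canned text so the example runs as-is.
def llm(prompt):
    canned = [
        "Thought: I need to compute the refund amount.\nAction: calculator[120 * 0.8]",
        "Thought: The refund is 96.0.\nFinal Answer: The customer is owed $96.00.",
    ]
    return canned[prompt.count("Observation:")]

def calculator(expression):
    # Toy tool: evaluate a simple arithmetic expression.
    return str(eval(expression, {"__builtins__": {}}))

TOOLS = {"calculator": calculator}

def react(question, max_steps=5):
    prompt = f"Question: {question}\n"
    for _ in range(max_steps):
        reply = llm(prompt)
        prompt += reply + "\n"
        if "Final Answer:" in reply:
            return reply.split("Final Answer:")[1].strip()
        # Parse "Action: tool[input]", run the tool, feed back an Observation.
        action_line = [ln for ln in reply.splitlines() if ln.startswith("Action:")][0]
        tool_name, tool_input = action_line[len("Action:"):].strip().rstrip("]").split("[", 1)
        prompt += f"Observation: {TOOLS[tool_name.strip()](tool_input)}\n"

print(react("A customer returns an item bought for $120 with a 20% restocking fee. How much is refunded?"))

A real agent would replace llm() with an API call and register more tools, but the control flow (reason, act, observe, repeat) stays the same.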

Permutation Tests in Statistics and Clinical Trials Development: Sample Code

Description:
1. Validate the permutation test table using the measured values. There are some missing values; you can adjust the dpl dataset for other imputed data.
2. Use the planned placebo-to-active allocation ratio of 2:1. Although the actual data may not perfectly match this ratio, treatment groups in simulations are assigned based on this planned allocation, using the total number of subjects with available data for change from baseline.
3. Permutation strategy:
• If the sample size is ≤ 14, perform full permutations.
• If the sample size is > 14, perform 10,000 iterations.
4. For the two-sample t-test, use the pooled-variance method, which assumes equal variances between the two treatment groups.
5. To run the analysis, adjust the dataset named dpl accordingly.
/***** Sample SAS code ******/
** One sample test and inference:
** 0. Calculate observed t value for change
** 1. Get the random sample b...
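For readers who prefer to prototype outside SAS, here is a minimal Python sketch of the same permutation logic: full enumeration when the combined sample size is at most 14, otherwise 10,000 random relabelings, with the pooled-variance two-sample t statistic. The function name and the toy change-from-baseline values at the end are illustrative, not part of the original SAS program.

import numpy as np
from itertools import combinations
from scipy import stats

def permutation_pvalue(placebo, active, n_iter=10_000, seed=2025):
    # Two-sided permutation test on the pooled-variance two-sample t statistic.
    x = np.concatenate([placebo, active])
    n, n1 = len(x), len(placebo)
    t_obs = stats.ttest_ind(placebo, active, equal_var=True).statistic
    if n <= 14:
        # Full enumeration: every way of labeling n1 of the n subjects as placebo.
        t_perm = []
        for idx in combinations(range(n), n1):
            mask = np.zeros(n, dtype=bool)
            mask[list(idx)] = True
            t_perm.append(stats.ttest_ind(x[mask], x[~mask], equal_var=True).statistic)
        t_perm = np.asarray(t_perm)
    else:
        # Monte Carlo: 10,000 random relabelings.
        rng = np.random.default_rng(seed)
        t_perm = np.empty(n_iter)
        for i in range(n_iter):
            shuffled = rng.permutation(x)
            t_perm[i] = stats.ttest_ind(shuffled[:n1], shuffled[n1:], equal_var=True).statistic
    return float(np.mean(np.abs(t_perm) >= abs(t_obs)))

# Illustrative (made-up) change-from-baseline values, roughly 2:1 placebo:active.
placebo = np.array([-0.2, 0.1, 0.4, -0.1, 0.3, 0.0, 0.2, -0.3])
active = np.array([-1.1, -0.7, -0.9, -1.4])
print(permutation_pvalue(placebo, active))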

Prompt Engineering for ChatGPT (3)

1.1 Question Refinement Pattern
1.2 Format of the Question Refinement Pattern
To use this pattern, your prompt should make the following fundamental contextual statements:
From now on, whenever I ask a question, suggest a better version of the question to use instead
(Optional) Prompt me if I would like to use the better version instead
Examples:
From now on, whenever I ask a question, suggest a better version of the question to use instead
From now on, whenever I ask a question, suggest a better version of the question and ask me if I would like to use it instead
Tailored Examples:
Whenever I ask a question about dieting, suggest a better version of the question that emphasizes healthy eating habits and sound nutrition. Ask me for the first question to refine.
Whenever I ask a question about who is the greatest of all time (GOAT), suggest a better version of the question that puts multiple players' unique accomplishments into perspective. Ask me for the first question to ref...
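If you want to wire the Question Refinement Pattern into code rather than type it into the chat window, one option is to put the contextual statement in the system message. A minimal sketch, assuming the OpenAI Python SDK, an API key in the environment, and a model name of "gpt-4o" (swap in whatever model you actually use):

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

SYSTEM = (
    "From now on, whenever I ask a question, suggest a better version of the "
    "question to use instead, and ask me if I would like to use it instead."
)

response = client.chat.completions.create(
    model="gpt-4o",  # assumed model name; replace with the one you have access to
    messages=[
        {"role": "system", "content": SYSTEM},
        {"role": "user", "content": "What should I eat to lose weight?"},
    ],
)
print(response.choices[0].message.content)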

Prompt Engineering for ChatGPT (2)

2. Introducing New Information to the Large Language Model
3. How to Deal with Prompt Size Limitations?
✅ 1. Query Only What’s Relevant (Selective Retrieval)
Idea: Instead of loading entire documents, pull only the parts that matter to the task.
Example: You're working with a collection of meeting transcripts and want insights on a decision about budget planning.
Instead of this (too long): “Here are 20 pages of meeting notes…”
Do this: Use a search function or embedding model to extract only the paragraphs mentioning “budget,” “finance,” or “cost estimate.”
Prompt to LLM: “Here are excerpts from recent meetings related to budget planning. Please summarize the decisions made and any unresolved issues.”
✅ 2. Filter Out Extraneous Information
Idea: Manually or programmatically remove boilerplate, irrelevant sections, or repeated text before prompting the model.
Example: You have product reviews, but many just repeat phrases like “great product” or incl...
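Here is a minimal sketch of the "query only what's relevant" idea using plain keyword matching rather than an embedding model; the keyword list and the meeting transcripts are placeholders invented for the example:

# Keep only paragraphs that mention any of the target keywords, then build a
# compact prompt from those excerpts instead of sending the full transcripts.
KEYWORDS = ("budget", "finance", "cost estimate")

def relevant_paragraphs(transcript, keywords=KEYWORDS):
    paragraphs = [p.strip() for p in transcript.split("\n\n") if p.strip()]
    return [p for p in paragraphs if any(k.lower() in p.lower() for k in keywords)]

def build_prompt(transcripts):
    excerpts = []
    for t in transcripts:
        excerpts.extend(relevant_paragraphs(t))
    return (
        "Here are excerpts from recent meetings related to budget planning.\n"
        "Please summarize the decisions made and any unresolved issues.\n\n"
        + "\n\n".join(excerpts)
    )

# Example with placeholder transcripts:
meetings = [
    "Team intros and agenda review.\n\nThe budget for Q3 was capped at $50k pending a revised cost estimate.",
    "Offsite logistics discussion.\n\nFinance asked engineering to resubmit the cloud cost estimate by Friday.",
]
print(build_prompt(meetings))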

Summary of Prompt Patterns for Conversational LLMs

Here is a summary of the prompt patterns described in the document, along with original examples:
1. Meta Language Creation Pattern
Purpose: Define a custom language or shorthand notation for the LLM to understand.
Example: "From now on, whenever I type two identifiers separated by a '!', I am describing a graph. For example, 'a ! b' means a graph with nodes 'a' and 'b' and an edge between them."
2. Output Automater Pattern
Purpose: Generate scripts or automation artifacts to perform recommended steps.
Example: "Whenever you generate code that spans more than one file, create a Python script to automatically create the specified files or make changes to existing files."
3. Flipped Interaction Pattern
Purpose: Have the LLM ask questions to gather information and achieve a goal.
Example: "Ask me questions to deploy a Python application to AWS. When you have enough informat...
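As a small illustration of the Meta Language Creation Pattern, the sketch below prepends the pattern statement to a shorthand message and also parses the 'a ! b' notation locally as a sanity check. The graph shorthand follows the example above; the helper names are made up for the demo:

META_LANGUAGE = (
    "From now on, whenever I type two identifiers separated by a '!', I am "
    "describing a graph. For example, 'a ! b' means a graph with nodes 'a' and "
    "'b' and an edge between them."
)

def build_messages(shorthand):
    # The pattern statement goes first so the model knows how to read the shorthand.
    return [
        {"role": "system", "content": META_LANGUAGE},
        {"role": "user", "content": shorthand},
    ]

def parse_edges(shorthand):
    # Local sanity check: turn "a ! b, b ! c" into [("a", "b"), ("b", "c")].
    edges = []
    for pair in shorthand.split(","):
        left, _, right = pair.partition("!")
        if right:
            edges.append((left.strip(), right.strip()))
    return edges

print(build_messages("a ! b, b ! c"))
print(parse_edges("a ! b, b ! c"))   # [('a', 'b'), ('b', 'c')]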

Prompt Engineering for ChatGPT (1)

1. The Persona Pattern
Example: Designing Prompts for Kindness: The "Helpful Assistant" Pattern
As AI assistants become increasingly integrated into our daily lives, ensuring they interact respectfully and helpfully is critical. One powerful approach is to design prompts that guide the AI’s behavior through clear, context-setting instructions. A great example of this is the "Helpful Assistant" pattern, a structured way to prompt AI models to be both cooperative and kind.
🧠 What is the "Helpful Assistant" Pattern?
This pattern is about embedding core behavioral expectations into your prompt. These statements shape how the AI interprets its role and ensure it avoids generating harmful or negative responses.
🔑 Fundamental Contextual Statements:
At the heart of this pattern are three key ideas:
You are a helpful AI assistant.
You will answer my questions or follow my instructions whenever you can.
You will never answer in a way that is insulting...
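A minimal sketch of how a persona line and the three contextual statements can be composed into a single system message; the persona and the user question are placeholders, and the third statement is abridged because the excerpt above is truncated:

# Compose the "Helpful Assistant" contextual statements with a persona line.
HELPFUL_ASSISTANT = [
    "You are a helpful AI assistant.",
    "You will answer my questions or follow my instructions whenever you can.",
    "You will never answer in a way that is insulting.",  # abridged; see the full post for the exact wording
]

def persona_system_prompt(persona):
    # Persona pattern: "Act as X" framed by the helpful-assistant statements.
    return " ".join([f"Act as {persona}."] + HELPFUL_ASSISTANT)

messages = [
    {"role": "system", "content": persona_system_prompt("a patient statistics tutor")},
    {"role": "user", "content": "Explain what a permutation test is in two sentences."},
]
print(messages[0]["content"])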