Introduction to Fabric
The quality of results from Large Language Models (LLMs), whether accessed through services like ChatGPT or run locally with Ollama, depends heavily on the quality of the prompts used. This understanding has led to the emergence of specialized professions like Prompt Engineers, whose task is to configure LLMs optimally to achieve the best possible results for specific requirements or contexts. In addition, hundreds of articles have been published with promising titles like “The 100 Best Prompts for Generating a Job Application for a Top-Manager Position.”
Fabric, the tool introduced here, aims to simplify and optimize the prompt creation process to obtain the best results from an LLM. This article explains the installation and basic usage.
What is Fabric?
The Fabric project by Daniel Miessler was developed to enable LLM users to refocus on content instead of getting lost in countless prompt experiments. Fabric offers a library of over 140 so-called patterns tailored to specific tasks and problems. Examples include “extract_wisdom,” which extracts insights and key points from a text or YouTube video, and “create_coding_project,” which helps create the framework for a development project. The patterns describe in great detail which steps are performed with the input, the role the LLM should play, what it should pay special attention to, and how the output should be structured.
Fabric combines the input, whether it’s a text or a task context, with the chosen pattern and passes it to an LLM. This can either be done locally, as with LLMs provided by Ollama, or through cloud services like OpenAI, Claude, or Google.
As of late August 2024, Fabric is only a command-line tool. The previous Python version had a GUI and server interface, but these are not currently available.
Installation of Fabric
The installation is done through the project’s GitHub repository. Fabric is written in Go, and some additional tools are needed to run the examples. Therefore, some preparations are necessary before installation. Since the installation and later usage happen in the Terminal program, some experience with the command line is recommended. Feel free to ask any open questions in the comments; I will try to answer them as quickly as possible.
Preparation for Installation
First, `go`, `git`, and `ffmpeg` may need to be installed. I use the package manager Homebrew for this purpose. If Homebrew is installed on the Mac, the necessary tools can be quickly installed in the Terminal with the following command:
brew install git go ffmpeg
If the programs were already installed, no reinstallation will occur.
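Before continuing, it can help to confirm that all three tools are actually reachable from the shell. A small sketch (the loop itself is plain POSIX shell and works independently of Homebrew):

```shell
# Check whether each required tool is on the PATH before installing Fabric
for tool in git go ffmpeg; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: found"
  else
    echo "$tool: missing - install it with 'brew install $tool'"
  fi
done
```

If any line reports `missing`, rerun the `brew install` command above before proceeding.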
After installing Go on my Mac, the search paths were not added to the `.zshrc` file. For a `go` installed with `brew`, a slightly different path for `GOROOT` is needed than described in the instructions on GitHub. To add these search paths, open the `~/.zshrc` file in an editor:
# I use nano as an editor
nano ~/.zshrc
Then scroll to the end of the file and enter the following:
# Set Go Path
export GOROOT="$(brew --prefix golang)/libexec"
export GOPATH=$HOME/go
export PATH=$GOPATH/bin:$GOROOT/bin:$HOME/.local/bin:$PATH
If Go was not installed with Homebrew, the GOROOT might look different. However, I cannot validate this:
export GOROOT=/usr/local/go
With ⌃X, then Y and Return to confirm saving, the editor is closed and the file is saved. Now, with the command
source ~/.zshrc
the changes are made known to the system.
Steps to Install Fabric
First, the latest version of Fabric is installed from the GitHub repository using the `go` command:
go install github.com/danielmiessler/fabric@latest
Then, the setup is started with the following command:
fabric --setup
If an error occurs when running `fabric --setup`, indicating that Fabric was not found, it may be due to incorrect entries in the `~/.zshrc` file, or because the `source` command was not executed. Alternatively, you can close the Terminal window and try running the setup again in a new window.
During the setup process, you will first be prompted to enter the API keys for services like Groq, Gemini, Anthropic, OpenAI, and Azure. If you don’t plan to use any of these services, you must at least install Ollama and provide the Ollama URL, which is typically `http://localhost:11434`. After entering the information for one or more of these services, a list of all available LLMs will be generated, from which you can select the default LLM. Following this, you will be asked to enter the YouTube API key. For the final prompts under the “Patterns Loader” section, I accepted the suggested values.
Once these inputs are completed, the patterns are downloaded and installed in `$HOME/.config/fabric/patterns`. Additionally, a `.env` file is created in the `$HOME/.config/fabric` directory, where the provided information is stored. A file named `unique_patterns.txt` contains a list of all patterns. In the `contexts` folder, you can store text files that provide additional context when invoking Fabric. This allows you to define general supplementary information once and apply it across different patterns as needed.

The purpose of the `sessions` folder, which is also created during setup, is still unclear to me.
Basic Usage of Fabric
Fabric is a command-line tool that adheres to the Unix philosophy of flexibly combining small, specialized programs to create more complex workflows. Programs are connected using the pipe symbol (`|`), where the output (stdout) of one program becomes the input (stdin) of the next. In a workflow with Fabric, any command-line tool that generates output on stdout or consumes input via stdin can be utilized. For example, `echo` can be used for text input, or `cat` can be used to pass a text file. Additionally, results can be written to a file using `>` or appended to a file with `>>`.
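Before involving Fabric at all, the pipe mechanism itself can be seen with two standard tools: `echo` writes a sentence to stdout, and `wc -w` reads it from stdin and counts the words:

```shell
# echo writes to stdout; the pipe feeds wc, which counts the words
echo "Two parrots on a skyscraper rooftop" | wc -w   # → 6
```

Every Fabric workflow in this article follows exactly this shape, with `fabric` taking the place of `wc`.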
The following command line demonstrates how to create a simple workflow with echo and Fabric that processes a text and generates a prompt suggestion for DALL‑E or Stable Diffusion to create a logo:
echo "Two parrots on a skyscraper rooftop" | fabric --stream --pattern create_logo
In this example, the echo command outputs the text “Two parrots on a skyscraper rooftop,” which is then passed to the Fabric command via the pipe symbol (|). Fabric processes this text using the create_logo pattern, sends the result to the LLM, which in turn generates a prompt suggestion that is displayed in the command line. The LLM, in this case, GPT-4o-mini, generated the following prompt:
A simple, vector graphic logo featuring two stylized parrots perched on the edge of a minimalist skyscraper rooftop. The design should emphasize elegance and simplicity, using clean lines and a limited color palette to convey a modern urban feel
I manually submitted this prompt to ChatGPT and received a matching logo image.
This approach allows for seamless integration of different command-line tools into a cohesive workflow with Fabric, providing flexibility and power in handling various text processing tasks.
The following example demonstrates how to create a summary of a webpage using Fabric (note that `wget` and `pandoc` may need to be installed using Homebrew):
wget -qO - https://example.com/my-blog-article | pandoc -f html -t plain | fabric --stream --pattern summarize
In this example:

- `wget` downloads the webpage content.
- The HTML code is passed to `pandoc`, which converts it to plain text.
- The plain text is then piped to Fabric, which uses the `summarize` pattern to process the text with the LLM, and the summary is output.
To process a specific section of text from a document and directly reinsert the result back into the document, the `pbpaste` and `pbcopy` commands on macOS are particularly useful. These commands allow you to pass the clipboard content to Fabric, then copy the output back to the clipboard, which can subsequently be pasted directly into the document using ⌘-V.
Here’s how you can do it:
pbpaste | fabric -sp improve_writing | pbcopy
In this workflow:

- `pbpaste` takes the current content of the clipboard and passes it to Fabric.
- Fabric processes the text using the `improve_writing` pattern.
- The processed text is then copied back to the clipboard using `pbcopy`.
- You can then paste the improved text back into your document by pressing ⌘-V.
This method streamlines the editing process, making it easy to enhance specific sections of your text and quickly reintegrate them into your documents.
Advanced Features of Fabric
In the old Python version of Fabric, additional tools like `yt` for transcribing YouTube videos, `ts` for transcribing audio files, and `save` for saving the output to a defined folder were included. As of August 24, 2024, only the `yt` command has been ported to the new version. It is currently uncertain whether the other tools will be reintroduced.
However, the `yt` command needs to be installed separately in this version:
go install github.com/danielmiessler/yt@latest
The `yt` tool extracts spoken text from a YouTube video, which can then be processed by Fabric.
Here’s how you can use it:
yt --transcript https://www.youtube.com/watch?v=UbDyjIIGaxQ | fabric --stream --output $HOME/Video_transcript.md --pattern extract_wisdom
In this example, the text from the video “You’ve Been Using AI Wrong” by NetworkChuck is extracted and analyzed. The key insights, according to the defined rules of the `extract_wisdom` pattern, are saved in a Markdown file named `Video_transcript.md` in the home directory.
The video “You’ve Been Using AI Wrong” is what brought Fabric to my attention. It’s worth watching, though it discusses the old, Python-based version of Fabric.
As a replacement for the `save` command, which allowed saving the output to a defined location, I use the following shell script. I save it under `$HOME/Applications` (which is included in my search path) with the name `save`, and make it executable with the following command:
chmod +x $HOME/Applications/save
Here is the script:
#!/bin/zsh
export PATH=$HOME/Applications/:$PATH
# Define the directory where the file will be saved
TARGET_DIR="$HOME/Documents/fabric_files"
# Create the directory if it doesn't exist
mkdir -p "$TARGET_DIR"
# Set the current date as a prefix for the filename
DATE_PREFIX=$(date +"%Y-%m-%d")
# Set the base filename
BASENAME="note"
EXTENSION=".md"
# Generate the initial filename
FILENAME="${DATE_PREFIX}-${BASENAME}${EXTENSION}"
# Check if the file exists and find an incremental number
COUNTER=1
while [[ -e "${TARGET_DIR}/${FILENAME}" ]]; do
    FILENAME="${DATE_PREFIX}-${BASENAME}-${COUNTER}${EXTENSION}"
    COUNTER=$((COUNTER + 1))
done
# Read the text from the pipe and save it to the file
cat > "${TARGET_DIR}/${FILENAME}"
Now, the result of a Fabric command can easily be saved in the `Documents/fabric_files` folder under a name like `2024-08-23-note.md` using the following command:
cat ~/Documents/Article_Draft.md | fabric -sp summarize | save
Useful Commands for Fabric
Before diving into the real magic of Fabric, namely the patterns, here are some important commands:
fabric --listpatterns # This command lists all available patterns.
fabric --listmodels # This command lists all available LLMs.
fabric --changeDefaultModel # This command allows you to change the default LLM.
fabric --model llama3.1:latest -sp summarize # This command temporarily changes the LLM to `llama3.1:latest` for this specific call.
fabric --output myFile.md -sp summarize # This command saves the output of the Fabric call to `myFile.md`.
fabric --context blogger-support -sp summarize
As previously mentioned, you can store files in the `~/.config/fabric/contexts` directory that provide additional information for the Fabric command. For example, in a `blogger-support` file, you could define additional rules that are helpful in the context of writing a blog article. These rules can then be used not only with one pattern but, via the `--context` option, across many different pattern calls.
cat ~/.config/fabric/.env # This command shows the current configuration stored in the `.env` file.
Additional Help
For more options and explanations, you can use the following commands:
fabric --help
or
yt --help
These commands provide a detailed list of available options and their explanations for both Fabric and the YT tool.
The Importance of Patterns in Fabric
Patterns are the core of Fabric, as they efficiently provide proven prompts that are combined with your own content and sent to an LLM for processing. These patterns can be easily adapted to your specific needs, or you can convert your own tried-and-true prompts into patterns to use with Fabric.
During installation, around 140 patterns are installed (and the number keeps growing). These patterns are typically divided into sections. Most patterns begin with the “IDENTITY and PURPOSE” section, which describes the role the LLM should play and the tasks it has to perform. This is followed by sections with instructions on how the LLM should approach the analysis of the input and what it should pay particular attention to. The “OUTPUT” section describes the format of the output. The final section is the “INPUT,” which will later be supplemented with your input.
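To make this structure concrete, here is a sketch of a minimal custom pattern. It assumes the convention of one folder per pattern containing a `system.md` file, as used by the installed patterns; the name `my_tldr` and the wording of the sections are made up for illustration:

```shell
# Create a minimal pattern folder with the typical sections (hypothetical example)
mkdir -p "$HOME/.config/fabric/patterns/my_tldr"
cat > "$HOME/.config/fabric/patterns/my_tldr/system.md" <<'EOF'
# IDENTITY and PURPOSE

You are an expert at condensing any text into a short summary.

# OUTPUT

Output exactly three sentences, with no preamble and no headings.

# INPUT
EOF
```

Afterwards, the new pattern can be invoked like any built-in one, e.g. `cat article.md | fabric -sp my_tldr`.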
For example, the write_essay pattern includes the following instruction in the “IDENTITY and PURPOSE” section:
# IDENTITY and PURPOSE

You are an expert on writing concise, clear, and illuminating essays on the topic of the input provided.
However, this alone is not sufficient, as the LLM also needs to understand what Paul Graham’s style looks like. Therefore, the pattern includes text examples totaling 9,025 words that define the typical essay style of Paul Graham. Following this, there are also some detailed output instructions.
This is, of course, a pattern that begs to be customized. For instance, instead of Paul Graham, you could use your own name, the text examples could be taken from your own writings, and the instructions could be adjusted so that the output is an article for a magazine rather than an essay.
If, like me, you are a German native speaker, you might find it especially useful to include an instruction in the “OUTPUT INSTRUCTIONS” section that ensures the output is in German:
- Make the output in German.
When customizing patterns, it’s advisable to make changes in a copy of the pattern folder, which you can then rename to something like `my_summary`. The developers recommend saving these customized folders in a separate directory and only placing a copy in the `patterns` folder. This precaution is important because the original patterns, and potentially your customized patterns stored in that folder, may be overwritten during an update.
Using Context Files: Another Way to Adapt Patterns
As mentioned above, using the `--context` option is another effective method to tailor the output to your personal needs. Let’s continue with the example of ensuring the output is in German. To do this, create a file in the `contexts` directory:
nano ~/.config/fabric/contexts/german
And include the following content:
Make the output in German
Now, whenever you call Fabric with the `--context german` option alongside any pattern, the output will be in German.
For example:
echo "Why is the sky blue" | fabric --context german --pattern create_academic_paper
In this case, Fabric will process the input and generate the output in German, regardless of the pattern used. This approach provides a flexible and reusable way to ensure consistent output across various tasks and patterns.
But using context files in Fabric offers far more possibilities than simply setting a language preference. The context feature allows you to include a wide range of instructions that can enhance or modify the behavior of multiple patterns, making it a powerful tool for customizing how Fabric processes your input.
For example, suppose you want to consistently generate academic content with a formal tone, (still) ensure that the output is in German, and include citations in APA style. You can achieve this by creating a context file that encapsulates all these requirements:
nano ~/.config/fabric/contexts/academic_german
Then, include the following content:
Make the output in German
Use a formal academic tone
Include citations in APA style
Ensure the content is well-structured with clear headings and subheadings
Now, whenever you invoke Fabric with this context, it will apply these instructions in addition to the pattern’s default behavior:
echo "The effects of climate change on Arctic wildlife" | fabric --context academic_german --pattern create_academic_paper | pandoc -f latex -o output.pdf
This example creates the paper in German and uses `pandoc` to convert `fabric`’s output directly to a PDF.
Context files can be expanded to include any instructions that are useful to you, such as:
- Target audience considerations: Tailoring content for specific readers, like industry professionals or a general audience.
- Specific writing styles: Adopting the style of a particular author or publication.
- Formatting guidelines: Ensuring consistent formatting, such as bullet points, lists, or paragraph styles.
By leveraging the power of context files, you can create versatile, reusable configurations that extend and enrich the functionality of Fabric patterns, making your workflows more efficient and tailored to your exact needs.
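As a further sketch of the first point, here is a context file aimed at a general audience; the filename `general-audience` and the individual rules are, of course, just examples you would replace with your own:

```shell
# Create a hypothetical context file for non-technical readers
mkdir -p "$HOME/.config/fabric/contexts"
cat > "$HOME/.config/fabric/contexts/general-audience" <<'EOF'
Write for a general audience without prior technical knowledge
Explain unavoidable technical terms in a single short sentence
Keep paragraphs to four sentences or fewer
EOF
```

It is then used exactly like the `german` example above: `cat article.md | fabric --context general-audience -sp summarize`.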
Optimizing the Fabric Call
In the old version of Fabric, a file was included that created an alias definition for each Fabric pattern call and made these aliases available through the initialization of the .zshrc file. Since the new version no longer provides this functionality, I have recreated it as a shell script. In this example, as a German speaker, I added the context file from above to ensure that all output is in German.
#!/bin/bash
# Fixed path to the directory to be scanned
input_dir="$HOME/.config/fabric/patterns"
# Check if the path is a valid directory
if [ ! -d "$input_dir" ]; then
    echo "The directory $input_dir does not exist or is not a valid directory."
    exit 1
fi
# Filename for the .inc file
output_file="$HOME/.config/fabric/fabric-bootstrap.inc"
# Initialize the .inc file
echo "# Automatically generated aliases" > "$output_file"
# Loop through the subdirectories in the specified directory
for dir in "$input_dir"/*/; do
    # Check if it is indeed a directory
    if [ -d "$dir" ]; then
        # Remove the trailing slash and the path to get only the folder name
        folder_name=$(basename "$dir")
        # Create an alias in the form: alias foldername="fabric -sp foldername"
        echo "alias $folder_name=\"fabric -C=german -sp $folder_name\"" >> "$output_file"
    fi
done
echo "The file $output_file was successfully created."
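To check the generator’s logic without touching the real pattern folder, you can point the same loop at a scratch directory containing two fake pattern folders (all paths here are temporary and made up for the test):

```shell
# Dry run of the alias generator against a temporary directory
tmp=$(mktemp -d)
mkdir -p "$tmp/patterns/summarize" "$tmp/patterns/extract_wisdom"
out="$tmp/fabric-bootstrap.inc"
echo "# Automatically generated aliases" > "$out"
for dir in "$tmp/patterns"/*/; do
  name=$(basename "$dir")
  echo "alias $name=\"fabric -C=german -sp $name\"" >> "$out"
done
cat "$out"
```

This prints the header line followed by one alias line per folder, e.g. `alias summarize="fabric -C=german -sp summarize"`.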
Next, add the following entry at the end of your ~/.zshrc
file:
# Created by `Fabric Installation` on 2024-08-06 12:45:30
if [ -f "/Users/leifjp/.config/fabric/fabric-bootstrap.inc" ];
then . "/Users/leifjp/.config/fabric/fabric-bootstrap.inc";
fi
Activate the new .zshrc
with:
source ~/.zshrc
Now, a command like:
wget -qO- https://example.com/my-blog-article | pandoc -f html -t plain | summarize
will output the summary in German instead of requiring:
wget -qO- https://example.com/my-blog-article | pandoc -f html -t plain | fabric -sp summarize
This approach works with any pattern without needing modification. Of course, you can also add additional general instructions to the context file.
Conclusion and Outlook
I hope this brief introduction to Fabric makes getting started a bit easier. In a future post, I will demonstrate how to further simplify the use of Fabric and how it can be integrated with Apple’s Shortcuts.
If you have any questions, feedback, or ideas regarding Fabric and its installation, please feel free to use the comments section.