See the llama-recipes repo for an example of how to add a safety checker to the inputs and outputs of your inference code.

Examples using CodeLlama-7b-Instruct:

```
torchrun --nproc_per_node 1 example_instructions.py \
    --ckpt_dir CodeLlama-7b-Instruct/ \
    --tokenizer_path CodeLlama-7b-...
```
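As a rough illustration of the idea, a safety checker wraps the model call so that both the prompt and the completion are screened before anything is returned. The sketch below is hypothetical: `contains_blocked_term` and `safe_generate` are placeholder names and a toy substring check, not the llama-recipes API.

```python
# Minimal sketch of wrapping inference with input/output safety checks.
# The checker and the generate callable are hypothetical stand-ins,
# not the actual llama-recipes implementation.

def contains_blocked_term(text: str, blocked=("rm -rf /",)) -> bool:
    """Toy safety check: flag text containing any blocked term."""
    return any(term in text for term in blocked)

def safe_generate(generate, prompt: str) -> str:
    """Check the prompt, run generation, then check the output."""
    if contains_blocked_term(prompt):
        return "[input rejected by safety checker]"
    output = generate(prompt)
    if contains_blocked_term(output):
        return "[output withheld by safety checker]"
    return output

# Stub generator standing in for the model call.
result = safe_generate(lambda p: f"response to: {p}", "write a sort function")
print(result)
```

A real checker would typically be a classifier (e.g. a moderation model) rather than a substring match, but the wrapping pattern is the same: screen the input, screen the output, and only then return text to the caller.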