Class ModelParameters

Namespace: OobaboogaAPIHelper
Assembly: DougWare.OobaboogaAPIHelper.dll

ModelParameters represents the parameters used to control a chat model's text generation process.

public class ModelParameters
Inheritance
object ← ModelParameters
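
Examples

A minimal sketch of configuring a ModelParameters instance for a chat request. The property values below are illustrative choices, not defaults defined by the library.

using OobaboogaAPIHelper;

var parameters = new ModelParameters
{
    do_sample = true,            // enable sampling
    temperature = 0.7,           // moderate randomness
    top_p = 0.9,                 // nucleus sampling cutoff
    top_k = 40,                  // consider only the 40 most likely tokens
    repetition_penalty = 1.15,   // discourage repeated tokens
    max_new_tokens = 400,        // budget for the response
    truncation_length = 2048,    // maximum context length in tokens
    seed = -1                    // -1 = random seed
};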

Properties

add_bos_token

If true, adds a beginning-of-sequence (BOS) token to the start of the prompt. Some models need this.

public bool add_bos_token { get; set; }

Property Value

bool

ban_eos_token

If true, bans the end-of-sequence (EOS) token, forbidding the model from ending the generation prematurely. Some models need this to be left unset.

public bool ban_eos_token { get; set; }

Property Value

bool

do_sample

If true, token sampling is used during generation. If false, sampling is disabled and a deterministic decoding strategy (such as greedy or contrastive search) is used instead.

public bool do_sample { get; set; }

Property Value

bool

early_stopping

Used with beam search. If true, generation stops as soon as the required number of complete candidate sequences has been found, rather than continuing to search for better ones.

public bool early_stopping { get; set; }

Property Value

bool

encoder_repetition_penalty

The "Hallucinations filter", penalizes tokens that are not in the prior text. Higher values make the model stay more in context, lower values allow it to diverge.

public double encoder_repetition_penalty { get; set; }

Property Value

double

length_penalty

Used with beam search: an exponential penalty applied to sequence length when scoring candidates. Values above 0 promote longer sequences; values below 0 encourage shorter ones.

public double length_penalty { get; set; }

Property Value

double

max_context_length

This is the maximum string length for a prompt that the API will consume. If your prompt exceeds this length, it will be truncated from the top.

public int max_context_length { get; set; }

Property Value

int

max_new_tokens

The maximum number of new tokens to generate, which determines the length of the generated text. This setting is important because the context window is shared between the prompt and the response: a smaller value leaves more room for the prompt, while a larger value allows a longer response.

public int max_new_tokens { get; set; }

Property Value

int
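
As a rough sketch, assuming the backend reserves max_new_tokens out of the truncation_length context window for the response (as text-generation-webui does), the tokens left for the prompt can be estimated like this:

using OobaboogaAPIHelper;

var parameters = new ModelParameters
{
    truncation_length = 2048,   // total context window in tokens
    max_new_tokens = 512        // reserved for the response
};

// Approximate prompt budget: 2048 - 512 = 1536 tokens.
int promptBudget = parameters.truncation_length - parameters.max_new_tokens;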

min_length

The minimum length of the generated text in tokens.

public int min_length { get; set; }

Property Value

int

no_repeat_ngram_size

If not set to 0, specifies the length of token sequences (n-grams) that are completely blocked from repeating. Higher values block larger phrases; lower values block individual words or characters from repeating. Only 0 or high values are recommended.

public double no_repeat_ngram_size { get; set; }

Property Value

double

num_beams

The number of beams in beam search. Higher values can provide better results but use more VRAM.

public int num_beams { get; set; }

Property Value

int

penalty_alpha

The alpha coefficient for contrastive search. Contrastive search is enabled when this value is greater than zero and do_sample is false; it works best with a low top_k (for example, 4).

public double penalty_alpha { get; set; }

Property Value

double
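
A minimal sketch of enabling contrastive search with this setting; the specific values (penalty_alpha = 0.6, top_k = 4) are illustrative, not library defaults.

using OobaboogaAPIHelper;

// Contrastive search: sampling disabled, penalty_alpha above zero, low top_k.
var parameters = new ModelParameters
{
    do_sample = false,
    penalty_alpha = 0.6,
    top_k = 4
};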

repetition_penalty

Applies an exponential penalty to repeating tokens. 1 means no penalty, higher values reduce repetition, lower values increase repetition.

public double repetition_penalty { get; set; }

Property Value

double

seed

The seed for random number generation. Use -1 for random.

public int seed { get; set; }

Property Value

int

skip_special_tokens

If true, the model will skip special tokens in the output. Unsetting this can make replies more creative.

public bool skip_special_tokens { get; set; }

Property Value

bool

temperature

Controls the randomness of outputs. 0 makes output deterministic, while higher values increase randomness.

public double temperature { get; set; }

Property Value

double

top_k

Selects only the top_k most likely tokens, similar to top_p. Higher values increase the range of possible random results.

public int top_k { get; set; }

Property Value

int

top_p

When not set to 1, only tokens with probabilities adding up to less than this number are selected. Higher values increase the range of possible random results.

public double top_p { get; set; }

Property Value

double

truncation_length

The maximum context length in tokens. If the prompt exceeds this length, the leftmost tokens are removed. Most models require this to be at most 2048.

public int truncation_length { get; set; }

Property Value

int

typical_p

Selects only tokens that are at least this much more likely to appear than random tokens, given the prior text.

public double typical_p { get; set; }

Property Value

double

Methods

FromEmbeddedResource<T>(string)

Creates an instance of T populated from the named embedded resource.

public static T FromEmbeddedResource<T>(string resourceName) where T : ModelParameters, new()

Parameters

resourceName string

The name of the embedded resource to read.

Returns

T

Type Parameters

T

The parameters type to create. It must derive from ModelParameters and have a parameterless constructor.
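
A hypothetical usage sketch; the resource name below is a placeholder and assumes the parameters are stored as an embedded JSON resource in the calling assembly.

using OobaboogaAPIHelper;

// "MyApp.Resources.chat-defaults.json" is a placeholder resource name.
var parameters = ModelParameters.FromEmbeddedResource<ModelParameters>(
    "MyApp.Resources.chat-defaults.json");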

FromFile<T>(string)

Creates an instance of T populated from a parameters file on disk.

public static T FromFile<T>(string filePath) where T : ModelParameters, new()

Parameters

filePath string

The path of the file to read.

Returns

T

Type Parameters

T

The parameters type to create. It must derive from ModelParameters and have a parameterless constructor.
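
A hypothetical usage sketch; the file paths and the CreativeParameters subclass are illustrative, and show how the generic constraint allows loading into a type derived from ModelParameters.

using OobaboogaAPIHelper;

// Load parameters into the base type from a local file (the path is illustrative).
var defaults = ModelParameters.FromFile<ModelParameters>("parameters.json");

// The constraint (T : ModelParameters, new()) also allows a derived type.
var creative = ModelParameters.FromFile<CreativeParameters>("creative.json");

// CreativeParameters is a hypothetical subclass used only for illustration.
public class CreativeParameters : ModelParameters { }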