MonoPara

Code for the paper "Monotonic Paraphrasing Improves Generalization of Language Model Prompting".

  • To generate paraphrases with MonoPara, use the function run_PPL_greedy in main_down_stream.py.
  • The logits ensemble manipulates the predicted logits at every decoding step; see the sketch after this list.
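
The following is a minimal sketch of one logits-ensemble greedy decoding step, not the repo's exact code. It assumes `paraphrase_model` and `target_model` are Hugging Face causal LMs, `paraphrase_ids` is the paraphrase model's context (instruction plus original prompt), and `target_ids` is the target model's context (the paraphrase generated so far); these names are illustrative.

```python
import torch

@torch.no_grad()
def ensemble_greedy_step(paraphrase_model, target_model,
                         paraphrase_ids, target_ids, alpha=0.5):
    # Next-token logits from each model at the current decoding step.
    logit_for_next_step = paraphrase_model(paraphrase_ids).logits[:, -1, :]
    logit_for_next_step_target = target_model(target_ids).logits[:, -1, :]

    # Weighted ensemble: a larger alpha leans more on the target model,
    # i.e., toward tokens with lower perplexity under the target model.
    ensemble_logits = (alpha * logit_for_next_step_target
                       + (1.0 - alpha) * logit_for_next_step)

    # Greedy choice of the next token.
    return ensemble_logits.argmax(dim=-1, keepdim=True)
```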

Notes:

  • A larger alpha puts more weight on logit_for_next_step_target, i.e., it assigns higher weight to the tokens that the target model predicts with the highest probability, which is equivalent to lower perplexity with respect to the target model. We set alpha=0.5 for all experiments.
  • Ignore the option parameter and leave it at its default.
  • We use mistralai/Mistral-7B-Instruct-v0.1 as both the target model and the paraphrase model; a loading sketch follows below.
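
The snippet below sketches how to load mistralai/Mistral-7B-Instruct-v0.1 as both models with Hugging Face Transformers; details such as dtype and device placement are assumptions, not the repo's exact settings.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "mistralai/Mistral-7B-Instruct-v0.1"
tokenizer = AutoTokenizer.from_pretrained(model_name)

# The same checkpoint serves as both the paraphrase model and the target model.
paraphrase_model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype="auto")
target_model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype="auto")
```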
