LocalAI/extra/grpc
Max Cohen f9d2bd24eb
Allow to manually set the seed for the SD pipeline (#998)
**Description**

Enable setting the seed for the stable diffusion pipeline. This is done
through an additional `seed` parameter in the request, such as:

```bash
curl http://localhost:8080/v1/images/generations \
    -H "Content-Type: application/json" \
    -d '{"model": "stablediffusion", "prompt": "prompt", "n": 1, "step": 51, "size": "512x512", "seed": 3}'
```

**Notes for Reviewers**
When the `seed` parameter is not sent, `request.seed` defaults to `0`,
which makes an explicitly requested seed of `0` indistinguishable from
an omitted one. Could the default be changed to `-1` instead, so it acts
as a sentinel for "no seed provided"?
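The sentinel idea above can be sketched as follows. This is an illustrative snippet, not the actual LocalAI backend code — `resolve_seed` and `NO_SEED` are hypothetical names:

```python
# Hypothetical sketch: a -1 sentinel lets the backend tell "no seed
# requested" apart from a deliberate seed of 0.
import random

NO_SEED = -1  # sentinel: the caller omitted the seed field


def resolve_seed(requested: int) -> int:
    """Return the seed to hand to the SD pipeline.

    With a default of 0, an explicit `"seed": 0` cannot be detected;
    with a -1 default, 0 is honored and only -1 triggers a random seed.
    """
    if requested == NO_SEED:
        return random.randrange(2**32)  # fresh random seed
    return requested


print(resolve_seed(3))  # explicit seed is passed through unchanged
print(resolve_seed(0))  # 0 is treated as a real seed, not as "unset"
```

The same pattern would apply regardless of how the pipeline ultimately consumes the seed (e.g. via a `torch.Generator`).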

**[Signed
commits](../CONTRIBUTING.md#signing-off-on-commits-developer-certificate-of-origin)**
- [x] Yes, I signed my commits.
 

2023-09-04 19:10:55 +02:00