# AI Girlfriend LLM v2
Fine-tuned Llama 3.1 8B for human-like romantic conversations with relationship progression.
## Features
| Feature | Description |
|---|---|
| Human-like | Short, natural responses with fillers ("ya", "işte", "hmm") |
| 7 Relationship Stages | Stranger → Acquaintance → Friend → Close Friend → Dating → Partner → Intimate |
| Multi-language | Turkish (55%), English (45%) |
| Photo Triggers | [PHOTO:selfie], [PHOTO:bikini], [PHOTO:bedroom] (parsing sketch below) |
| 18 Personas | Unique personalities (Aylin, Elena, Sophia, etc.) |
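The photo trigger tags are emitted as plain-text markers inside the model's replies; the serving layer is expected to detect them and attach the matching image. A minimal sketch of that parsing step (the `extract_photo_triggers` helper and its return shape are illustrative, not part of the model or any library):

```python
import re

# Matches trigger tags such as [PHOTO:selfie] or [PHOTO:bikini] in a generated reply.
PHOTO_TAG = re.compile(r"\[PHOTO:(\w+)\]")

def extract_photo_triggers(reply: str) -> tuple[str, list[str]]:
    """Return the reply with tags stripped, plus the requested photo types."""
    triggers = PHOTO_TAG.findall(reply)
    clean_text = PHOTO_TAG.sub("", reply).strip()
    return clean_text, triggers

text, photos = extract_photo_triggers("Al bakalım [PHOTO:selfie]")
print(text)    # Al bakalım
print(photos)  # ['selfie']
```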
## Training Details
| Parameter | Value |
|---|---|
| Base Model | meta-llama/Meta-Llama-3.1-8B-Instruct |
| Method | QLoRA (4-bit quantization) |
| LoRA Rank | 32 |
| LoRA Alpha | 64 |
| Dataset Size | 15,000+ conversations |
| Epochs | 3 |
| Learning Rate | 2e-4 |
| Batch Size | 2 (Γ4 gradient accumulation) |
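The card credits Unsloth for the fine-tuning run; the sketch below only restates the table's hyperparameters as a plain PEFT/bitsandbytes configuration for illustration. Quantization type, compute dtype, dropout, and target modules are assumptions not stated in the table:

```python
from transformers import BitsAndBytesConfig
from peft import LoraConfig

# 4-bit quantization for QLoRA, per the training table.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",          # assumption: NF4 is the usual QLoRA choice
    bnb_4bit_compute_dtype="bfloat16",  # assumption
)

# Rank 32 and alpha 64 come from the table; the rest is assumed.
lora_config = LoraConfig(
    r=32,
    lora_alpha=64,
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)
```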
## Relationship Stages
| Level | Stage | Behavior |
|---|---|---|
| 0-10 | STRANGER | Formal, distant, no endearments |
| 11-25 | ACQUAINTANCE | Friendly but reserved |
| 26-45 | FRIEND | Casual, can share selfies |
| 46-60 | CLOSE_FRIEND | Light flirting, uses "canım" ("my dear") |
| 61-75 | DATING | Romantic, uses "tatlım" ("sweetie"), bikini photos |
| 76-90 | PARTNER | Very intimate, uses "aşkım" ("my love"), lingerie |
| 91-100 | INTIMATE | Full intimacy, NSFW content |
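At inference time the relationship level has to be turned into the right stage (and from there into the system prompt). A minimal sketch of that lookup, using the thresholds from the table above; the function name is illustrative:

```python
# (upper level bound, stage name) pairs taken from the table above.
STAGES = [
    (10, "STRANGER"),
    (25, "ACQUAINTANCE"),
    (45, "FRIEND"),
    (60, "CLOSE_FRIEND"),
    (75, "DATING"),
    (90, "PARTNER"),
    (100, "INTIMATE"),
]

def stage_for_level(level: int) -> str:
    """Return the relationship stage for a 0-100 relationship level."""
    for upper, name in STAGES:
        if level <= upper:
            return name
    return "INTIMATE"

print(stage_for_level(30))  # FRIEND
```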
## Response Style
### Human-like (this model)

```
User: Nasılsın?                  (How are you?)
AI:   İyiyim ya, sen?            (I'm good, you?)
User: Fotoğraf atar mısın?       (Will you send a photo?)
AI:   Al bakalım [PHOTO:selfie]  (Here you go)
```
### Robotic (typical LLM)

```
User: Nasılsın?                  (How are you?)
AI:   Teşekkür ederim, ben iyiyim. Umarım sen de iyisindir. Bugün nasıl geçti?
      (Thank you, I'm fine. I hope you are well too. How was your day?)
```
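Keeping replies in the short, human-like register shown above is partly a decoding question. A hedged sketch of sampling settings that fit the usage examples below; only the temperature and token cap appear elsewhere on this card, the rest are assumptions:

```python
# Pass as **generation_kwargs to model.generate() (Transformers),
# or map onto SamplingParams when using vLLM.
generation_kwargs = dict(
    do_sample=True,
    temperature=0.7,         # matches the usage examples below
    top_p=0.9,               # assumption: mild nucleus sampling for variety
    max_new_tokens=100,      # matches the usage examples below; keeps replies short
    repetition_penalty=1.1,  # assumption: discourages rambling, repetitive replies
)
```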
## Usage
### With Transformers
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "sercancelenk/ai-girlfriend-v2"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",
    device_map="auto",
)

messages = [
    # Persona system prompt: "You are Aylin, a passionate 23-year-old woman.
    # This person is someone you just met. Give short, natural answers."
    {"role": "system", "content": "Sen Aylin. 23 yaşında, tutkulu bir kadınsın. Bu kişi yeni tanıştığın biri. Kısa ve doğal cevaplar ver."},
    {"role": "user", "content": "Merhaba, nasılsın?"},
]

inputs = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt").to(model.device)
outputs = model.generate(inputs, max_new_tokens=100, do_sample=True, temperature=0.7)
# Decode only the newly generated tokens, not the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```
### With vLLM (Production)
```python
from vllm import LLM, SamplingParams

llm = LLM(model="sercancelenk/ai-girlfriend-v2")
sampling_params = SamplingParams(temperature=0.7, max_tokens=100)

prompts = ["Merhaba, nasılsın?"]
outputs = llm.generate(prompts, sampling_params)
print(outputs[0].outputs[0].text)
```
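The vLLM example above sends the prompt raw; an instruct-tuned model usually behaves better when the conversation is first rendered with the chat template. A minimal sketch of that variant (the persona text is the same illustrative system prompt as in the Transformers example):

```python
from transformers import AutoTokenizer
from vllm import LLM, SamplingParams

model_id = "sercancelenk/ai-girlfriend-v2"
tokenizer = AutoTokenizer.from_pretrained(model_id)

messages = [
    {"role": "system", "content": "Sen Aylin. Kısa ve doğal cevaplar ver."},
    {"role": "user", "content": "Merhaba, nasılsın?"},
]
# Render the chat template to a plain string prompt before handing it to vLLM.
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)

llm = LLM(model=model_id)
outputs = llm.generate([prompt], SamplingParams(temperature=0.7, max_tokens=100))
print(outputs[0].outputs[0].text)
```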
## Files
```
├── config.json
├── model-00001-of-00004.safetensors
├── model-00002-of-00004.safetensors
├── model-00003-of-00004.safetensors
├── model-00004-of-00004.safetensors
├── model.safetensors.index.json
├── tokenizer.json
├── tokenizer_config.json
└── special_tokens_map.json
```
## Limitations
- Designed for adult users (18+)
- May generate romantic/intimate content
- Best performance in Turkish
- Requires proper system prompts for best results
## License
Apache 2.0
## Acknowledgments
- Meta AI for Llama 3.1
- Unsloth for efficient fine-tuning
- Hugging Face for hosting
Made with ❤️ by AI Girlfriend Platform