
AI Interview Question 11: How do you optimize RAG?

Optimizing RAG is not a single trick; it is systematic engineering across the whole pipeline. Below I walk through the four stages (data indexing, retrieval, generation, and evaluation), covering the full set of optimizations and how to present them in an interview.


1. Data indexing optimization (parsing and chunking)

This is the most commonly overlooked stage, yet it sets the quality ceiling for everything downstream.

| Optimization point | Problem | Approach | Expected gain |
|---|---|---|---|
| Document parsing | Tables and formulas in PDFs get mangled; scanned documents and irregular layouts parse badly. | Use a capable parsing library (e.g. unstructured, pypdf with layout handling); extract tables with pandas and convert them to Markdown. | Recall +5~15% |
| Chunk size | Chunks that are too small lose context (in "its revenue grew", the "it" has no referent); chunks that are too large dilute relevance and add noise. | Experiment with chunk sizes (256/512/768 tokens) and 10~20% overlap; for long documents, split along semantic boundaries (sections/paragraphs), not fixed lengths. | Case-dependent; measure it |
| Metadata enrichment | A retrieved chunk is relevant, but you cannot tell its source, date, or document type, or you need to filter by department. | Attach metadata to every chunk: source (name/URL), timestamp, page_num, doc_type. Filter at query time (e.g. doc_type == 'legal'). | Filtering precision |
| Embedding model choice | General-purpose embeddings underperform on specialized text (medical, finance, legal). | Use a domain-appropriate model (BGE‑large‑zh, GTE‑Qwen2‑7B‑instruct) or fine-tune your own embedding (e.g. with triplet loss). | Retrieval MRR@10 +10~20% |
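The chunk-size and overlap advice above can be sketched as a simple splitter. This is a minimal sketch, not a library API: whitespace-separated words stand in for real tokenizer tokens, and `chunk_text` is a hypothetical helper name.

```python
def chunk_text(text: str, chunk_size: int = 256, overlap_ratio: float = 0.15) -> list[str]:
    """Split text into word-based chunks with ~10-20% overlap between neighbors.

    `chunk_size` counts words here for simplicity; in practice you would
    count tokens with the embedding model's own tokenizer.
    """
    words = text.split()
    # Advance by less than a full chunk so consecutive chunks overlap.
    step = max(1, int(chunk_size * (1 - overlap_ratio)))
    chunks = []
    for start in range(0, len(words), step):
        chunks.append(" ".join(words[start:start + chunk_size]))
        if start + chunk_size >= len(words):
            break  # the final chunk already covers the tail
    return chunks
```

Semantic segmentation (splitting on headings and paragraph boundaries first, then falling back to a size limit) usually beats this fixed-size scheme on long documents, which is exactly the trade-off the table flags as "measure it".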

2. Retrieval optimization (make "finding the book" more accurate)

Retrieval decides what the LLM gets to see; it is the model's working material.

| Optimization point | Problem | Approach | Expected gain |
|---|---|---|---|
| Hybrid retrieval | Vector search misses exact strings (e.g. product model ABC-123); keyword search misses semantic paraphrases. | Combine vector search (semantics) with BM25 (keywords), fused by weighting (e.g. 0.7*vector + 0.3*BM25) or by reranking. | Recall +10~25% |
| Reranking | The top results from vector search are not necessarily in the best order; a second scoring pass helps. | Use a cross‑encoder model (e.g. BGE‑reranker-v2, Cohere Rerank) to rescore the candidates (e.g. top 20) and keep the top‑K. | Precision (especially top‑1) |
| Query rewriting | User queries are vague, or in multi-turn chat a pronoun has no explicit referent. | Use an LLM to rewrite the query into a self-contained form (e.g. "What about the iPhone 15's?" expanded into a full question), or generate multiple query variants. | Recall +5~15% |
| HyDE | The query is short or abstract (e.g. "explain photosynthesis"), so direct matching against documents is weak. | Have the LLM write a hypothetical answer first, then use that answer as the retrieval query. | Good for open-ended questions, weak for precise facts |
| Top‑K selection | K too small risks missing key evidence; K too large wastes tokens and adds noise. | Experiment with K=3/5/10 and watch the recall vs. answer-quality trade-off. | Cost vs. quality balance |
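The weighted-fusion row above (0.7*vector + 0.3*BM25) can be sketched as follows. `hybrid_fuse` is a hypothetical helper, and the min-max normalization is one common choice, needed because raw BM25 scores and cosine similarities live on different scales.

```python
def hybrid_fuse(vector_scores: dict, bm25_scores: dict,
                w_vec: float = 0.7, w_bm25: float = 0.3) -> list:
    """Fuse two retrieval score dicts (doc_id -> score) into one ranking.

    Each score set is min-max normalized to [0, 1] so the weights are
    comparable; a document missing from one retriever scores 0 there.
    """
    def normalize(scores):
        if not scores:
            return {}
        lo, hi = min(scores.values()), max(scores.values())
        span = (hi - lo) or 1.0  # avoid division by zero on constant scores
        return {d: (s - lo) / span for d, s in scores.items()}

    v, b = normalize(vector_scores), normalize(bm25_scores)
    fused = {d: w_vec * v.get(d, 0.0) + w_bm25 * b.get(d, 0.0)
             for d in set(v) | set(b)}
    return sorted(fused, key=fused.get, reverse=True)
```

An alternative to score fusion is Reciprocal Rank Fusion, which combines ranks instead of scores and avoids the normalization question entirely; either way, the fused list is what you feed into the reranker.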

3. Generation optimization (make the LLM use the context well)

Even with perfect retrieval, a bad prompt or a weak model throws it all away.

| Optimization point | Problem | Approach | Expected gain |
|---|---|---|---|
| Prompt engineering | The LLM ignores the context or fabricates content. | State it explicitly: "Answer only from the excerpts below. If the information is insufficient or off-topic, reply 'Based on the available information, I cannot answer.'" Add few‑shot examples. | Faithfulness +20~40% |
| Context compression | The retrieved content is too long (it blows the context window or is mostly noise). | Use LLMLingua or Selective Context to compress it down to the key information before passing it to the LLM. | Trade-off against information loss |
| LLM upgrade | A small model (7B) cannot reason well over the context or handle long contexts. | Switch to a stronger model (GPT‑4o, Claude 3.5 Sonnet, Qwen2.5‑72B). | Reasoning accuracy vs. cost |
| Citations and attribution | Users cannot verify the answer. | Have the LLM emit markers like [citation:1] that map back to the retrieved chunks; verify the citations afterwards. | User trust + verifiability |
| Refusal threshold | The model answers when it should not (hallucinates), or refuses when it should answer. | Set a threshold: if the top‑1 chunk's cosine similarity to the query is below about 0.7, have the system reply "insufficient information." | Fewer hallucinations |
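The prompt-engineering, citation, and refusal-threshold rows can be combined in one place. A minimal sketch: the template wording, the 0.7 cutoff, and the `build_prompt` helper are illustrative assumptions, not a fixed recipe.

```python
REFUSAL = "Based on the available information, I cannot answer."

PROMPT_TEMPLATE = """Answer ONLY from the excerpts below. Cite sources as [citation:N].
If the excerpts do not contain the answer, reply exactly: "{refusal}"

Excerpts:
{context}

Question: {question}
Answer:"""

def build_prompt(question: str, retrieved: list, sim_threshold: float = 0.7):
    """retrieved: list of (chunk_text, cosine_similarity), best hit first.

    If even the top hit is below the threshold, return None so the caller
    can emit REFUSAL without spending an LLM call at all.
    """
    if not retrieved or retrieved[0][1] < sim_threshold:
        return None
    # Number the chunks so the model's [citation:N] markers map back to them.
    context = "\n".join(f"[citation:{i + 1}] {text}"
                        for i, (text, _) in enumerate(retrieved))
    return PROMPT_TEMPLATE.format(refusal=REFUSAL, context=context,
                                  question=question)
```

Short-circuiting before generation is the cheapest anti-hallucination lever: the model never sees a question it has no evidence for.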

4. Evaluation and iteration (the part you cannot skip)

What you cannot measure, you cannot optimize.

| Method | Approach | Metric |
|---|---|---|
| Build an evaluation set | Collect 100~300 real queries + golden answers + the IDs of the relevant chunks. | The baseline for every optimization decision. |
| Automated evaluation | RAGAS (Faithfulness, Answer Relevance, Context Recall) or TruLens. | Core metrics: faithfulness, answer relevance, context recall. |
| Human review | Sample ~20 bad cases each week and classify the failure mode (retrieval miss / generation error / content missing from the knowledge base). | Sets the optimization priorities. |
| A/B testing | In production, route traffic across retrieval strategies (e.g. BM25 vs. hybrid). | Online metrics: user satisfaction, unresolved-question rate. |
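A minimal sketch of scoring a retrieval stack against such an evaluation set. The `retrieve` callable and the field names are assumptions for illustration, not a RAGAS API; this computes plain retrieval recall@K, the offline metric the table's evaluation set enables.

```python
def retrieval_recall_at_k(eval_set: list, retrieve, k: int = 5) -> float:
    """Macro-averaged recall@k over an evaluation set.

    eval_set: list of {"query": str, "relevant_ids": set} entries, where
        relevant_ids is the (non-empty) set of golden chunk IDs.
    retrieve: callable (query, k) -> ranked list of chunk IDs.
    """
    scores = []
    for case in eval_set:
        got = set(retrieve(case["query"], k))
        rel = case["relevant_ids"]
        scores.append(len(got & rel) / len(rel))  # fraction of golden chunks found
    return sum(scores) / len(scores)
```

Running this for each candidate configuration (chunk size, K, fusion weights) turns the "experiment with..." advice above into numbers you can actually compare.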

5. How to present this in an interview (sample answer)

"In a RAG project I worked on, baseline recall was only 67%. I did three things:
1. Switched chunking from a fixed 1024 tokens to dynamic semantic segmentation (splitting by paragraph + section); recall rose to 74%;
2. Added hybrid retrieval (vector + BM25) plus a small rerank model; recall rose to 83%;
3. Optimized the prompt and added a hard rule against fabricating information; hallucination dropped from 22% to under 5%.

On top of that, we set up an evaluation pipeline in production, and before every change we ran RAGAS over a fixed set of 200 cases to make sure nothing regressed."
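The "no regression before shipping" habit from the answer above can be automated with a tiny gate. A sketch under stated assumptions: the metric names mirror the RAGAS ones mentioned earlier, the scores are made-up illustration values, and `regression_gate` is a hypothetical helper.

```python
def regression_gate(baseline: dict, candidate: dict, tolerance: float = 0.02) -> dict:
    """Compare two metric dicts (metric name -> score in [0, 1]).

    Returns the metrics that dropped by more than `tolerance`; an empty
    dict means the change is safe to ship.
    """
    return {
        metric: round(baseline[metric] - candidate.get(metric, 0.0), 4)
        for metric in baseline
        if baseline[metric] - candidate.get(metric, 0.0) > tolerance
    }

# Illustrative scores only: faithfulness improved, context recall regressed.
baseline = {"faithfulness": 0.91, "answer_relevance": 0.84, "context_recall": 0.78}
candidate = {"faithfulness": 0.95, "answer_relevance": 0.83, "context_recall": 0.71}
```

Wiring this check into CI makes the fixed 200-case evaluation set a real release gate rather than an occasional report.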


Key takeaway: the full RAG optimization map

Data layer ─→ Document parsing, chunking strategy, metadata enrichment, domain embeddings
Retrieval layer ─→ Hybrid retrieval, reranking, query rewriting, HyDE, Top‑K tuning
Generation layer ─→ Prompt engineering, context compression, citations, refusal threshold
Evaluation layer ─→ Evaluation set, RAGAS, human review, A/B testing
