Wuhuwill committed
Commit ba95f3f · verified · 1 Parent(s): 079bd0f

Add Dataset card and documentation

Files changed (1)
  1. README.md +93 -0
README.md ADDED
@@ -0,0 +1,93 @@

---
language:
- en
license: mit
size_categories:
- 10K<n<100K
task_categories:
- question-answering
- text-analysis
tags:
- knowledge-coupling
- llama2
- hotpotqa
- multi-hop-reasoning
- gradient-analysis
- ripple-effects
---

# Knowledge Coupling Analysis on HotpotQA Dataset

## Dataset Description

This dataset contains the results of a comprehensive knowledge coupling analysis performed on the HotpotQA dataset using the LLaMA2-7B model. The analysis investigates how different pieces of knowledge interact within the model's parameter space through gradient-based coupling measurements.

## Research Overview

- **Model**: meta-llama/Llama-2-7b-hf (analysis focused on layers 28-31)
- **Dataset**: HotpotQA (train + dev splits, 97,852 total samples; see the loading sketch after this list)
- **Method**: Gradient-based knowledge coupling via cosine similarity
- **Target Layers**: model.layers.28-31.mlp.down_proj (semantically rich layers)
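
The source corpus can be reproduced from the public HotpotQA release. Below is a minimal loading sketch, assuming the `hotpot_qa` dataset on the Hugging Face Hub with its `distractor` configuration; adjust the identifiers if a different copy of HotpotQA was used:

```python
from datasets import load_dataset

# Assumed source: the public HotpotQA dataset on the Hugging Face Hub.
hotpot = load_dataset("hotpot_qa", "distractor")

train = hotpot["train"]       # 90,447 questions
dev = hotpot["validation"]    # 7,405 questions
print(len(train) + len(dev))  # 97,852 total samples
```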

## Key Findings

The analysis revealed:
- Mean coupling score: 0.0222 across all knowledge piece pairs
- High coupling pairs (≥0.4 threshold): critical for ripple effect prediction
- Layer-specific analysis focusing on the MLP down-projection layers
- Comprehensive gradient analysis with 180,355,072 gradient dimensions per knowledge piece (see the dimensionality check after this list)
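
The per-piece dimensionality is consistent with the four targeted down-projection matrices of LLaMA2-7B (hidden size 4096, intermediate size 11008); a quick sanity check:

```python
# Each LLaMA2-7B down_proj maps intermediate_size (11008) -> hidden_size (4096).
hidden_size, intermediate_size, num_layers = 4096, 11008, 4  # layers 28-31
print(num_layers * hidden_size * intermediate_size)  # 180355072
```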

## Files Description

### Core Results
- `global_analysis_results.json`: Comprehensive analysis summary with statistics
- `all_knowledge_pieces.json`: Complete set of processed knowledge pieces (92 MB)
- `all_coupling_pairs.csv`: All pairwise coupling measurements (245 MB)

### Supporting Files
- `dataset_info.json`: Dataset statistics and conversion details
- `coupling_analysis_config.json`: Analysis configuration and parameters

## Usage

```python
from datasets import load_dataset

# Load the knowledge coupling results
dataset = load_dataset("your-username/hotpotqa-knowledge-coupling")

# Access global analysis results
global_results = dataset["global_analysis"]

# Access knowledge pieces
knowledge_pieces = dataset["knowledge_pieces"]

# Access coupling pairs
coupling_pairs = dataset["coupling_pairs"]
```
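
The pairwise table is large (245 MB), so it can also be read directly with pandas and filtered at the 0.4 threshold noted above. The column name used here (`coupling_score`) is illustrative; check it against the header of `all_coupling_pairs.csv`:

```python
import pandas as pd

# Read all pairwise coupling measurements (column names are illustrative).
pairs = pd.read_csv("all_coupling_pairs.csv")

# Keep only the high-coupling pairs highlighted in the analysis (threshold 0.4).
high = pairs[pairs["coupling_score"] >= 0.4]
print(f"{len(high)} high-coupling pairs out of {len(pairs)} total")
```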

## Citation

If you use this dataset in your research, please cite:

```bibtex
@dataset{hotpotqa_knowledge_coupling,
  title={Knowledge Coupling Analysis on HotpotQA Dataset using LLaMA2-7B},
  author={[Your Name]},
  year={2024},
  publisher={HuggingFace},
  url={https://huggingface.co/datasets/your-username/hotpotqa-knowledge-coupling}
}
```

## Technical Details

- **Gradient Computation**: ∇_θ log P(answer|question) for cloze-style questions
- **Coupling Measurement**: Cosine similarity between L2-normalized gradients (see the sketch after this list)
- **Memory Optimization**: Analysis restricted to layers 28-31 to fit GPU memory constraints
- **Hardware**: NVIDIA A40 GPU (46GB VRAM)
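
A minimal sketch of this measurement, assuming plain (question, answer) text pairs and gradients restricted to the four target down-projection matrices; the function names and details here are illustrative, not the exact analysis code:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL = "meta-llama/Llama-2-7b-hf"
TARGET = [f"model.layers.{i}.mlp.down_proj.weight" for i in range(28, 32)]

tokenizer = AutoTokenizer.from_pretrained(MODEL)
model = AutoModelForCausalLM.from_pretrained(MODEL, torch_dtype=torch.float32)
model.eval()

# Restrict gradient computation to the semantically rich target layers.
for name, p in model.named_parameters():
    p.requires_grad_(name in TARGET)

def answer_gradient(question: str, answer: str) -> torch.Tensor:
    """Flattened gradient of log P(answer | question) over the target layers."""
    prompt_ids = tokenizer(question, return_tensors="pt").input_ids
    answer_ids = tokenizer(answer, add_special_tokens=False, return_tensors="pt").input_ids
    input_ids = torch.cat([prompt_ids, answer_ids], dim=1)

    # Only the answer tokens contribute to the loss (labels = -100 elsewhere).
    labels = input_ids.clone()
    labels[:, : prompt_ids.shape[1]] = -100

    model.zero_grad()
    loss = model(input_ids=input_ids, labels=labels).loss
    (-loss).backward()  # gradient of the answer log-likelihood

    params = dict(model.named_parameters())
    return torch.cat([params[name].grad.flatten() for name in TARGET])

def coupling(piece_a, piece_b) -> float:
    """Cosine similarity between the L2-normalized gradients of two knowledge pieces."""
    g_a = torch.nn.functional.normalize(answer_gradient(*piece_a), dim=0)
    g_b = torch.nn.functional.normalize(answer_gradient(*piece_b), dim=0)
    return float(g_a @ g_b)

print(coupling(("The Eiffel Tower is located in", " Paris"),
               ("The capital of France is", " Paris")))
```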

## License

This dataset is released under the MIT License. The original HotpotQA dataset remains subject to its own licensing terms.