Maintains a persistent, interlinked markdown wiki between immutable raw sources and answers: ingest, query, lint, index, and log, compounding knowledge instead of one-shot RAG.
A data science team maintains a shared knowledge base wiki for their research. A new team member, Priya, has written a technical summary on neural network architectures that includes several diagrams stored as local image files alongside the article. Because the team uses the wiki to compound knowledge across projects, images referenced in wiki pages must be stored locally under the wiki's assets folder (not referenced as external or raw-path links) so the wiki remains self-contained and portable.
Priya's article is ready in the raw sources folder. The team wants it ingested into the wiki, with all referenced images available under the wiki's asset folder (wiki/assets/) and wiki pages referencing them with relative paths. The existing wiki already has a few topic pages on related subjects that should be cross-linked where appropriate.
Ingest the raw source into the wiki. The expected outputs are:
- wiki/topics/ pages covering the main concepts in the article (convolutional networks, residual connections)
- wiki/index.md with links and one-line descriptions for new or changed pages
- wiki/log.md using the correct heading format

The following files are provided as inputs. Extract them before beginning.
=============== FILE: wiki/index.md ===============
=============== FILE: wiki/topics/gradient-descent.md ===============
Gradient descent is an optimization algorithm used to minimize a loss function by iteratively moving in the direction of the steepest descent.
A learning rate of 0.01 is a common default starting point.
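As a minimal illustration of the update rule (a toy quadratic loss, chosen here for the example and not part of the page above), one gradient-descent loop using the 0.01 default learning rate looks like:

```python
# Gradient descent on a toy loss f(x) = (x - 3)^2,
# whose gradient is f'(x) = 2 * (x - 3).
x = 0.0          # initial parameter value
lr = 0.01        # the common default learning rate mentioned above
for _ in range(1000):
    grad = 2 * (x - 3)   # direction of steepest ascent
    x -= lr * grad       # step against the gradient
# x has converged very close to the minimum at x = 3
```

Each step shrinks the distance to the minimum by a factor of (1 - 2 * lr), so with lr = 0.01 the iterate contracts toward 3 geometrically.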
=============== FILE: wiki/topics/backpropagation.md ===============
Backpropagation is the algorithm used to compute gradients of a loss function with respect to model parameters using the chain rule.
It is used in conjunction with gradient descent to train neural networks.
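The chain rule can be worked by hand for a one-parameter model (a purely illustrative example; the variable names are invented here):

```python
# Backpropagation by hand for y = w * x with squared-error loss L = (y - t)^2.
w, x, t = 2.0, 3.0, 5.0
y = w * x                # forward pass: 6.0
L = (y - t) ** 2         # loss: (6 - 5)^2 = 1.0
# Backward pass: chain rule gives dL/dw = (dL/dy) * (dy/dw)
dL_dy = 2 * (y - t)      # 2.0
dy_dw = x                # 3.0
dL_dw = dL_dy * dy_dw    # 6.0, the gradient fed to gradient descent
```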
=============== FILE: wiki/log.md ===============
=============== FILE: raw/2026-04-06-cnn-architectures.md ===============
Date: 2026-04-06
Author: Priya Sharma
Convolutional Neural Networks (CNNs) use spatial feature extraction through learned filters. The core operation is the convolution, which slides a filter across the input to produce a feature map.
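The sliding-filter operation described above can be sketched in NumPy (a minimal valid-padding implementation written for this illustration, not taken from the article):

```python
import numpy as np

def conv2d(image, kernel):
    """Slide `kernel` across `image` (valid padding) to produce a feature map."""
    kh, kw = kernel.shape
    out_h = image.shape[0] - kh + 1
    out_w = image.shape[1] - kw + 1
    fmap = np.zeros((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            # Element-wise product of the filter with the current window, summed
            fmap[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return fmap

image = np.arange(16, dtype=float).reshape(4, 4)
kernel = np.array([[1.0, -1.0]])   # horizontal-difference filter
fmap = conv2d(image, kernel)       # feature map of shape (4, 3)
```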
Key diagram showing the CNN pipeline:

![CNN pipeline](images/cnn-pipeline.png)
ResNets (He et al., 2016) introduced skip connections that allow gradients to flow directly across layers:
y = F(x) + x
This solved the vanishing gradient problem for very deep networks.
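The skip connection y = F(x) + x can be illustrated with a toy NumPy block (the ReLU-linear choice of F and all names here are illustrative, not from the article):

```python
import numpy as np

def residual_block(x, weights):
    """Compute y = F(x) + x, where F is a toy ReLU-activated linear transform."""
    fx = np.maximum(0.0, x @ weights)   # F(x)
    return fx + x                       # skip connection adds the input back

x = np.ones(3)
weights = np.zeros((3, 3))              # F(x) = 0, so the block reduces to identity
y = residual_block(x, weights)
```

Because the input is added back unchanged, the gradient of y with respect to x always contains an identity term, which is why gradients flow directly across layers.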
Residual block diagram:

![Residual block](images/residual-block.png)
CNNs are trained with gradient descent and backpropagation. The depth enabled by residual connections makes the optimization landscape more well-behaved.
=============== FILE: images/cnn-pipeline.png =============== <binary image file — treat this as an existing image file at path images/cnn-pipeline.png>
=============== FILE: images/residual-block.png =============== <binary image file — treat this as an existing image file at path images/residual-block.png>