Fix MLIR Transform Tutorial Doc (#155285)
Fixes a small typo I noticed while reading through the tutorial: a duplicated verb ("can be defined be expressed") in the "Loop" Fusion section of Ch0.md.
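
For reference, the sentence being fixed introduces the tutorial's compare-and-select ReLU example, where `max(0, x)` is fused into the region of a single `linalg.generic`. A minimal sketch of that idiom follows (the function name `@relu` and the map alias `#id` are illustrative, not part of this patch):

```mlir
// Identity indexing map: each output element reads the matching input element.
#id = affine_map<(d0) -> (d0)>

func.func @relu(%in: tensor<?xf32>, %out: tensor<?xf32>) -> tensor<?xf32> {
  %result = linalg.generic {
    indexing_maps = [#id, #id],
    iterator_types = ["parallel"]
  } ins(%in : tensor<?xf32>) outs(%out : tensor<?xf32>) {
  ^bb0(%in_one: f32, %out_one: f32):
    %c0 = arith.constant 0.0 : f32
    // Compare-and-select: relu(x) = max(0, x), with no temporary
    // buffer for the comparison result.
    %cmp = arith.cmpf ogt, %in_one, %c0 : f32
    %sel = arith.select %cmp, %in_one, %c0 : f32
    linalg.yield %sel : f32
  } -> tensor<?xf32>
  return %result : tensor<?xf32>
}
```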
GitOrigin-RevId: 179f01b800e29b38f7d97c043ff331d4f202a12a
diff --git a/docs/Tutorials/transform/Ch0.md b/docs/Tutorials/transform/Ch0.md
index dc4b753..0d7a703 100644
--- a/docs/Tutorials/transform/Ch0.md
+++ b/docs/Tutorials/transform/Ch0.md
@@ -134,7 +134,7 @@
## “Loop” Fusion
-Since the region of the `linalg.generic` operation can contain arbitrarily many operations, we can use it to express “fusion” of the implicit loops by simply having more operations chained in the region. For example, the common machine learning rectified linear unit layer (ReLU), which can be defined as `relu(x) = max(0, x)`, can be defined be expressed using the “compare-and-select” idiom in one `linalg.generic` operation, without the temporary buffer for the comparison result and without repeating the outer operation:
+Since the region of the `linalg.generic` operation can contain arbitrarily many operations, we can use it to express “fusion” of the implicit loops by simply having more operations chained in the region. For example, the common machine learning rectified linear unit layer (ReLU), which can be defined as `relu(x) = max(0, x)`, can be expressed using the “compare-and-select” idiom in one `linalg.generic` operation, without the temporary buffer for the comparison result and without repeating the outer operation:
```mlir
linalg.generic {