Dataset Preview
The full dataset viewer is not available for this dataset; only a preview of the rows is shown below.
The dataset generation failed:

```
Error code:   DatasetGenerationError
Exception:    CastError
Message:      Couldn't cast
              task_id: string
              path: string
              left_context: string
              right_context: string
              groundtruth: string
              crossfile_context: string
              node_path: list<item: string>
                child 0, item: string
              target_type: string
              cursor_line: int64
              target: string
              file: string
              target_nlines: int64
              node_depth: int64
              to
              {'file': Value('string'), 'target_type': Value('string'), 'cursor_line': Value('int64'), 'target_nlines': Value('int64'), 'node_depth': Value('int64'), 'node_path': List(Value('string')), 'target': Value('string')}
              because column names don't match

Traceback (most recent call last):
  File "/usr/local/lib/python3.12/site-packages/datasets/builder.py", line 1779, in _prepare_split_single
    for key, table in generator:
                      ^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/datasets/packaged_modules/json/json.py", line 299, in _generate_tables
    self._cast_table(pa_table, json_field_paths=json_field_paths),
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/datasets/packaged_modules/json/json.py", line 128, in _cast_table
    pa_table = table_cast(pa_table, self.info.features.arrow_schema)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/datasets/table.py", line 2321, in table_cast
    return cast_table_to_schema(table, schema)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/datasets/table.py", line 2249, in cast_table_to_schema
    raise CastError(
datasets.table.CastError: Couldn't cast
task_id: string
path: string
left_context: string
right_context: string
groundtruth: string
crossfile_context: string
node_path: list<item: string>
  child 0, item: string
target_type: string
cursor_line: int64
target: string
file: string
target_nlines: int64
node_depth: int64
to
{'file': Value('string'), 'target_type': Value('string'), 'cursor_line': Value('int64'), 'target_nlines': Value('int64'), 'node_depth': Value('int64'), 'node_path': List(Value('string')), 'target': Value('string')}
because column names don't match

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 1347, in compute_config_parquet_and_info_response
    parquet_operations = convert_to_parquet(builder)
                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 980, in convert_to_parquet
    builder.download_and_prepare(
  File "/usr/local/lib/python3.12/site-packages/datasets/builder.py", line 882, in download_and_prepare
    self._download_and_prepare(
  File "/usr/local/lib/python3.12/site-packages/datasets/builder.py", line 943, in _download_and_prepare
    self._prepare_split(split_generator, **prepare_split_kwargs)
  File "/usr/local/lib/python3.12/site-packages/datasets/builder.py", line 1646, in _prepare_split
    for job_id, done, content in self._prepare_split_single(
                                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/datasets/builder.py", line 1832, in _prepare_split_single
    raise DatasetGenerationError("An error occurred while generating the dataset") from e
datasets.exceptions.DatasetGenerationError: An error occurred while generating the dataset
```
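The cast error above means the JSONL data carries more columns (`task_id`, `path`, `left_context`, `right_context`, `groundtruth`, `crossfile_context`, and so on) than the seven features declared for the viewer. A small stdlib-only sketch like the following can diagnose such a mismatch before touching the viewer config; the sample record here is a hypothetical stand-in shaped like the one in the traceback, not an actual row.

```python
import json

# The columns the declared features expect, taken from the error message above.
EXPECTED = {"file", "target_type", "cursor_line", "target_nlines",
            "node_depth", "node_path", "target"}

def diff_columns(jsonl_lines):
    """Compare the first JSONL record's keys against the declared features.

    Returns (extra, missing): columns present in the data but not in the
    features, and columns the features require but the data lacks.
    """
    record = json.loads(jsonl_lines[0])
    found = set(record)
    return found - EXPECTED, EXPECTED - found

# A record shaped like the one the traceback complains about (illustrative values).
sample = json.dumps({
    "task_id": "t0", "path": "a.scala", "left_context": "", "right_context": "",
    "groundtruth": "", "crossfile_context": "", "node_path": ["compilation_unit"],
    "target_type": "block", "cursor_line": 1, "target": "x", "file": "a.scala",
    "target_nlines": 1, "node_depth": 1,
})
extra, missing = diff_columns([sample])
print("extra columns:", sorted(extra))
print("missing columns:", sorted(missing))
```

Either the declared features must be widened to cover every column, or the extra columns dropped, before the Arrow cast can succeed.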


Preview columns: `file` (string), `target_type` (string), `cursor_line` (int64), `target_nlines` (int64), `node_depth` (int64), `node_path` (list), `target` (string).

- `chiffre/le-chiffre/src/main/scala/rocketchip/Configs.scala` · class_definition · cursor_line 32 · target_nlines 1 · node_depth 1 · node_path [ "compilation_unit", "class_definition" ]
  target: `class LeChiffreConfig extends Config(new WithLeChiffre ++ new DefaultConfig)`
- `chiffre/src/main/scala/chiffre/InjectorInfo.scala` · block · cursor_line 31 · target_nlines 1 · node_depth 18 · node_path [ "compilation_unit", "trait_definition", "template_body", "function_definition", "block", "field_expression", "interpolated_string_expression", "interpolated_string", "interpolation", "block", "call_expression", "field_expression", "call_expression", "arguments", "lambda_expression", "i...
  target: `|${fields.map(a => s"${a.serialize(tab + " ")}").mkString("\n")}"""`
- `chiffre/src/main/scala/chiffre/ScanField.scala` · class_definition · cursor_line 18 · target_nlines 1 · node_depth 1 · node_path [ "compilation_unit", "class_definition" ]
  target: `case class ScanFieldBindingException(msg: String) extends Exception(msg)`
- `chiffre/src/main/scala/chiffre/inject/CycleInjector.scala` · class_definition · cursor_line 21 · target_nlines 1 · node_depth 1 · node_path [ "compilation_unit", "class_definition" ]
  target: `case class Cycle(width: Int) extends ScanField`
- `chiffre/src/main/scala/chiffre/inject/LfsrInjector.scala` · class_definition · cursor_line 22 · target_nlines 1 · node_depth 1 · node_path [ "compilation_unit", "class_definition" ]
  target: `case class Difficulty(width: Int) extends ScanField with ProbabilityBind`
- `chiffre/src/main/scala/chiffre/inject/StuckAt.scala` · class_definition · cursor_line 58 · target_nlines 1 · node_depth 1 · node_path [ "compilation_unit", "class_definition" ]
  target: `class StuckAtInjectorWithId(bitWidth: Int, val scanId: String) extends StuckAtInjector(bitWidth) with ChiffreInjector`
- `chiffre/src/main/scala/chiffre/passes/FaultInstrumentation.scala` · block · cursor_line 64 · target_nlines 1 · node_depth 9 · node_path [ "compilation_unit", "class_definition", "template_body", "function_definition", "indented_block", "field_expression", "interpolated_string_expression", "interpolated_string", "interpolation", "block" ]
  target: `|${indent}renames:`
- `chiffre/src/main/scala/chiffre/passes/ScanChainTransform.scala` · block · cursor_line 43 · target_nlines 1 · node_depth 9 · node_path [ "compilation_unit", "class_definition", "template_body", "function_definition", "indented_block", "field_expression", "interpolated_string_expression", "interpolated_string", "interpolation", "block" ]
  target: `|${injectors.map{ case (k, v) => s"${tab} ${v.name}, ${k.module.name}, ${slaveIn(k).module.name}.${slaveIn(k).name}, ${slaveOut(k).module.name}.${slaveOut(k).name}"}.mkString("\n")}`
- `chiffre/src/test/scala/chiffreTests/InstrumentationSpec.scala` · class_definition · cursor_line 46 · target_nlines 5 · node_depth 1 · node_path [ "compilation_unit", "class_definition" ]
  target: `behavior of "Chiffree Injectee annotation" it should "emit an annotation" in { val circuit = ChiselStage.elaborate(new DummyInjectee)`
- `chiffre/src/test/scala/chiffreTests/inject/StuckAtInjectorSpec.scala` · block · cursor_line 64 · target_nlines 1 · node_depth 6 · node_path [ "compilation_unit", "class_definition", "template_body", "infix_expression", "block", "call_expression", "block" ]
  target: `Driver(() => new StuckAtInjector(8)) { dut => new StuckAtTester(dut) }`
- `diagrammer/src/main/scala/dotvisualizer/RenderSvg.scala` · if_expression · cursor_line 32 · target_nlines 2 · node_depth 7 · node_path [ "compilation_unit", "class_definition", "template_body", "function_definition", "block", "if_expression", "block", "if_expression" ]
  target: `|<!DOCTYPE svg PUBLIC "-//W3C//DTD SVG 1.1//EN" | "http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd">`
- `diagrammer/src/main/scala/dotvisualizer/ToLoFirrtl.scala` · function_definition · cursor_line 65 · target_nlines 1 · node_depth 3 · node_path [ "compilation_unit", "class_definition", "template_body", "function_definition" ]
  target: `private def onMod(mod: DefModule): DefModule = mod.map(onStmt)`
- `diagrammer/src/main/scala/dotvisualizer/dotnodes/LiteralNode.scala` · function_definition · cursor_line 6 · target_nlines 4 · node_depth 3 · node_path [ "compilation_unit", "class_definition", "template_body", "function_definition" ]
  target: `def render: String = { s"""$absoluteName [shape="circle" style="filled" BGCOLOR="#C0C0C0" label="$value"] """.stripMargin }`
- `diagrammer/src/main/scala/dotvisualizer/dotnodes/PrimOpNode.scala` · block · cursor_line 25 · target_nlines 1 · node_depth 7 · node_path [ "compilation_unit", "class_definition", "template_body", "val_definition", "interpolated_string_expression", "interpolated_string", "interpolation", "block" ]
  target: `override val absoluteName: String = s"op_${name}_${PrimOpNode.hash}"`
- `diagrammer/src/main/scala/dotvisualizer/stage/DiagrammerPhase.scala` · object_definition · cursor_line 22 · target_nlines 3 · node_depth 1 · node_path [ "compilation_unit", "object_definition" ]
  target: `val targets: Seq[PhaseManager.PhaseDependency] = Seq( Dependency[CheckPhase], Dependency[GetFirrtlCircuitPhase],`
- `diagrammer/src/main/scala/dotvisualizer/stage/phase/GenerateDotFilePhase.scala` · function_definition · cursor_line 22 · target_nlines 1 · node_depth 3 · node_path [ "compilation_unit", "class_definition", "template_body", "function_definition" ]
  target: `override def prerequisites = Seq(Dependency[OptionallyBuildTargetDirPhase])`
- `diagrammer/src/main/scala/dotvisualizer/stage/phase/OptionallyBuildTargetDirPhase.scala` · block · cursor_line 40 · target_nlines 1 · node_depth 18 · node_path [ "compilation_unit", "class_definition", "template_body", "function_definition", "block", "call_expression", "case_block", "case_clause", "if_expression", "block", "if_expression", "block", "throw_expression", "instance_expression", "arguments", "interpolated_string_expression", "inte...
  target: `throw new DiagrammerException(s"Error: Target dir ${targetDir} exists and is not a directory")`
- `diagrammer/src/main/scala/dotvisualizer/transforms/MakeDiagramGroup.scala` · class_definition · cursor_line 14 · target_nlines 3 · node_depth 1 · node_path [ "compilation_unit", "class_definition" ]
  target: `override def optionalPrerequisites = Seq.empty`
- `diagrammer/src/main/scala/dotvisualizer/transforms/RemoveTempWires.scala` · block · cursor_line 120 · target_nlines 3 · node_depth 18 · node_path [ "compilation_unit", "class_definition", "template_body", "function_definition", "block", "function_definition", "block", "function_definition", "block", "match_expression", "case_block", "case_clause", "val_definition", "call_expression", "arguments", "call_expression", "arguments", ...
  target: `val result = Some(Block(block.stmts.flatMap { substatement => removeGenStatement(substatement) }))`
- `diagrammer/src/test/scala/dotvisualizer/PrintfSpec.scala` · block · cursor_line 34 · target_nlines 1 · node_depth 11 · node_path [ "compilation_unit", "class_definition", "template_body", "infix_expression", "block", "function_definition", "block", "val_definition", "infix_expression", "parenthesized_expression", "if_expression", "block" ]
  target: `) ++ (if (showPrintfs) { Seq(ShowPrintfsAnnotation) }`
- `dana/src/main/scala/dana/ActivationFunction.scala` · block · cursor_line 222 · target_nlines 1 · node_depth 8 · node_path [ "compilation_unit", "class_definition", "template_body", "call_expression", "block", "call_expression", "block", "call_expression", "block" ]
  target: `} .otherwise { out := one }`
- `dana/src/main/scala/dana/ProcessingElement.scala` · class_definition · cursor_line 190 · target_nlines 3 · node_depth 1 · node_path [ "compilation_unit", "class_definition" ]
  target: `} is (PE_states('e_PE_ACTIVATION_FUNCTION)) { reqAf()`
- `dana/src/main/scala/dana/RegisterFile.scala` · class_definition · cursor_line 158 · target_nlines 3 · node_depth 1 · node_path [ "compilation_unit", "class_definition" ]
  target: `// the number of expected writes. assert(!Vec((0 until transactionTableNumEntries * 2).map( i => (state(i).valid &&`
- `dana/src/main/scala/dana/util/Util.scala` · block · cursor_line 13 · target_nlines 5 · node_depth 4 · node_path [ "compilation_unit", "object_definition", "template_body", "function_definition", "block" ]
  target: `def apply(epb: Int, pes: Int, cache: Int): Int = { var x = epb << (6 + 4); x = x | pes << 4; x = x | cache; x}`
- `dana/src/main/scala/standalone/XFilesTests.scala` · class_definition · cursor_line 10 · target_nlines 4 · node_depth 1 · node_path [ "compilation_unit", "class_definition" ]
  target: `abstract class XFilesTests(implicit p: Parameters) extends XFilesTester { // New Transaction`
- `dana/src/main/scala/util/QueueAf.scala` · function_definition · cursor_line 10 · target_nlines 1 · node_depth 3 · node_path [ "compilation_unit", "class_definition", "template_body", "function_definition" ]
  target: `override def cloneType = new QueueIOAf(gen, entries).asInstanceOf[this.type]`
- `dana/src/main/scala/util/SRAM.scala` · block · cursor_line 61 · target_nlines 1 · node_depth 6 · node_path [ "compilation_unit", "class_definition", "template_body", "for_expression", "block", "call_expression", "block" ]
  target: `when (io.we(i)) { mem(io.addr(i)) := io.din(i) }`
- `dana/src/main/scala/util/SRAMBlockIncrement.scala` · function_definition · cursor_line 68 · target_nlines 1 · node_depth 3 · node_path [ "compilation_unit", "class_definition", "template_body", "function_definition" ]
  target: `def index(j: Int): (Int, Int) = (elementWidth*(j+1) - 1, elementWidth * j)`
- `dana/src/main/scala/util/SRAMElementCounter.scala` · function_definition · cursor_line 20 · target_nlines 2 · node_depth 3 · node_path [ "compilation_unit", "class_definition", "template_body", "function_definition" ]
  target: `override def cloneType = new SRAMElementCounterResp ( sramDepth = sramDepth).asInstanceOf[this.type]`
- `dana/src/main/scala/util/SRAMElementIncrement.scala` · function_definition · cursor_line 71 · target_nlines 1 · node_depth 3 · node_path [ "compilation_unit", "class_definition", "template_body", "function_definition" ]
  target: `def index(j: Int): (Int, Int) = (elementWidth*(j+1) - 1, elementWidth * j)`
- `riscv-mini-five-stage/src/main/scala/riscv_mini_five_stage/IF_ID_Register.scala` · class_definition · cursor_line 21 · target_nlines 2 · node_depth 1 · node_path [ "compilation_unit", "class_definition" ]
  target: `val id_pc = Output(UInt(WLEN.W)) val id_pc_4 = Output(UInt(WLEN.W))`
- `riscv-mini-five-stage/src/main/scala/riscv_mini_five_stage/InstCache.scala` · object_definition · cursor_line 38 · target_nlines 3 · node_depth 1 · node_path [ "compilation_unit", "object_definition" ]
  target: `object InstCache extends App { chisel3.Driver.execute(args, () => new InstCache) }`
- `riscv-mini-five-stage/src/main/scala/riscv_mini_five_stage/PC.scala` · object_definition · cursor_line 30 · target_nlines 3 · node_depth 1 · node_path [ "compilation_unit", "object_definition" ]
  target: `object PC extends App { chisel3.Driver.execute(args, () => new PC) }`
- `riscv-mini-five-stage/src/test/scala/riscv_mini_five_stage_test/ALU_Test.scala` · block · cursor_line 30 · target_nlines 3 · node_depth 4 · node_path [ "compilation_unit", "class_definition", "template_body", "call_expression", "block" ]
  target: `iotesters.Driver.execute((Array("--is-verbose")), () => new ALU) { c => new ALU_Test(c) }`
- `riscv-mini-five-stage/src/test/scala/riscv_mini_five_stage_test/Addr_Buffer_Test.scala` · class_definition · cursor_line 20 · target_nlines 3 · node_depth 1 · node_path [ "compilation_unit", "class_definition" ]
  target: `poke(c.io.addr_input, addr) poke(c.io.record, record) peek(c.io.front)`
- `riscv-mini-five-stage/src/test/scala/riscv_mini_five_stage_test/Datapath_Test.scala` · block · cursor_line 23 · target_nlines 3 · node_depth 4 · node_path [ "compilation_unit", "class_definition", "template_body", "call_expression", "block" ]
  target: `iotesters.Driver.execute(Array("--backend-name", "verilator"), () => new Datapath) { c => new Datapath_Test(c) }`
- `riscv-mini-five-stage/src/test/scala/riscv_mini_five_stage_test/ID_EX_Register_Test.scala` · block · cursor_line 15 · target_nlines 3 · node_depth 4 · node_path [ "compilation_unit", "class_definition", "template_body", "call_expression", "block" ]
  target: `iotesters.Driver.execute(Array("--backend-name", "-verilator"), () => new ID_EX_Register) { c => new ID_EX_Register_Test(c) }`
- `riscv-mini-five-stage/src/test/scala/riscv_mini_five_stage_test/ImmGen_Test.scala` · block · cursor_line 23 · target_nlines 3 · node_depth 4 · node_path [ "compilation_unit", "class_definition", "template_body", "call_expression", "block" ]
  target: `iotesters.Driver.execute(Array("--is-verbose"), () => new ImmGen) { c => new ImmGen_Test(c) }`
- `riscv-mini-five-stage/src/test/scala/riscv_mini_five_stage_test/RegFile_Test.scala` · class_definition · cursor_line 13 · target_nlines 1 · node_depth 1 · node_path [ "compilation_unit", "class_definition" ]
  target: `for(i <- 1 to 10) {`
- `riscv-mini-five-stage/src/test/scala/riscv_mini_five_stage_test/Tile_Test.scala` · block · cursor_line 43 · target_nlines 3 · node_depth 4 · node_path [ "compilation_unit", "class_definition", "template_body", "call_expression", "block" ]
  target: `iotesters.Driver.execute(Array("--is-verbose"), () => new Tile){ c => new Tile_Test(c) }`
- `aes_chisel/src/main/scala/aes/InvMixColumns.scala` · function_definition · cursor_line 153 · target_nlines 1 · node_depth 3 · node_path [ "compilation_unit", "object_definition", "template_body", "function_definition" ]
  target: `def apply(Pipelined: Boolean = false): InvMixColumns = Module(new InvMixColumns(Pipelined))`
- `aes_chisel/src/main/scala/aes/MixColumns.scala` · function_definition · cursor_line 152 · target_nlines 1 · node_depth 3 · node_path [ "compilation_unit", "object_definition", "template_body", "function_definition" ]
  target: `def apply(Pipelined: Boolean = false): MixColumns = Module(new MixColumns(Pipelined))`
- `aes_chisel/src/main/scala/aes/Tables.scala` · object_definition · cursor_line 25 · target_nlines 3 · node_depth 1 · node_path [ "compilation_unit", "object_definition" ]
  target: `0xe1.U, 0xf8.U, 0x98.U, 0x11.U, 0x69.U, 0xd9.U, 0x8e.U, 0x94.U, 0x9b.U, 0x1e.U, 0x87.U, 0xe9.U, 0xce.U, 0x55.U, 0x28.U, 0xdf.U, 0x8c.U, 0xa1.U, 0x89.U, 0x0d.U, 0xbf.U, 0xe6.U, 0x42.U, 0x68.U, 0x41.U, 0x99.U, 0x2d.U, 0x0f.U, 0xb0.U, 0x54.U, 0xbb.U, 0x16.U))`
- `aes_chisel/src/main/scala/gcd/GCD.scala` · block · cursor_line 25 · target_nlines 1 · node_depth 4 · node_path [ "compilation_unit", "class_definition", "template_body", "call_expression", "block" ]
  target: `.otherwise { y := y - x }`
- `aes_chisel/src/main/scala/lfsr/LFSR.scala` · function_definition · cursor_line 33 · target_nlines 1 · node_depth 3 · node_path [ "compilation_unit", "object_definition", "template_body", "function_definition" ]
  target: `def apply(): LFSR = Module(new LFSR())`
- `aes_chisel/src/test/scala/aes/AddRoundKeyUnitTest.scala` · block · cursor_line 67 · target_nlines 2 · node_depth 6 · node_path [ "compilation_unit", "class_definition", "template_body", "infix_expression", "block", "if_expression", "block" ]
  target: `if (backendNames.contains("verilator")) { iotesters.Driver.execute(`
- `aes_chisel/src/test/scala/aes/ShiftRowsUnitTest.scala` · block · cursor_line 91 · target_nlines 3 · node_depth 9 · node_path [ "compilation_unit", "class_definition", "template_body", "infix_expression", "block", "if_expression", "block", "infix_expression", "call_expression", "block" ]
  target: `"--backend-name", "firrtl", "--generate-vcd-output", "on"), () => new ShiftRows) { c => new ShiftRowsUnitTester(c) } should be(true)`
- `aes_chisel/src/test/scala/aes/UnrolledAESUnitTest.scala` · block · cursor_line 62 · target_nlines 2 · node_depth 4 · node_path [ "compilation_unit", "class_definition", "template_body", "if_expression", "block" ]
  target: `Array(0x3d, 0xe2, 0x3a, 0x75, 0x52, 0x47, 0x75, 0xe7, 0x27, 0xbf, 0x9e, 0xb4, 0x54, 0x07, 0xcf, 0x39), Array(0x0b, 0xdc, 0x90, 0x5f, 0xc2, 0x7b, 0x09, 0x48, 0xad, 0x52, 0x45, 0xa4, 0xc1, 0x87, 0x1c, 0x2f),`
- `aes_chisel/src/test/scala/gcd/GCDMain.scala` · object_definition · cursor_line 47 · target_nlines 3 · node_depth 1 · node_path [ "compilation_unit", "object_definition" ]
  target: `object GCDRepl extends App { iotesters.Driver.executeFirrtlRepl(args, () => new GCD) }`
- `aes_chisel/src/test/scala/lfsr/LFSRUnitTest.scala` · block · cursor_line 57 · target_nlines 3 · node_depth 6 · node_path [ "compilation_unit", "class_definition", "template_body", "infix_expression", "block", "if_expression", "block" ]
  target: `iotesters.Driver.execute( Array("--target-dir", "test_run_dir/" + dir + "_firrtl_test", "--top-name", dir, "--backend-name", "firrtl", "--generate-vcd-output", "on"), () => new LFSR) {`
- `Quasar/design/src/main/scala/dec/dec_decode_ctl.scala` · block · cursor_line 543 · target_nlines 1 · node_depth 5 · node_path [ "compilation_unit", "class_definition", "template_body", "val_definition", "call_expression", "block" ]
  target: `val csr_set_x = withClock(io.active_clk){RegNext(csr_set_d, init=0.B)}`
- `Quasar/design/src/main/scala/dec/dec_ib_ctl.scala` · class_definition · cursor_line 9 · target_nlines 4 · node_depth 1 · node_path [ "compilation_unit", "class_definition" ]
  target: `class dec_ib_ctl_IO extends Bundle with param{ val ifu_ib = Flipped(new aln_ib) val ib_exu = Flipped(new ib_exu) val dbg_ib = new dbg_ib`
- `Quasar/design/src/main/scala/dec/dec_tlu_ctl.scala` · if_expression · cursor_line 3068 · target_nlines 1 · node_depth 10 · node_path [ "compilation_unit", "class_definition", "template_body", "function_definition", "call_expression", "field_expression", "call_expression", "arguments", "lambda_expression", "if_expression", "if_expression" ]
  target: `def pattern(y : List[Int]) = (0 until y.size).map(i=> if(y(i)>=0 & y(i)!='z') io.dec_csr_rdaddr_d(y(i)) else if(y(i)<0) !io.dec_csr_rdaddr_d(y(i).abs) else !io.dec_csr_rdaddr_d(0)).reduce(_&_)`
- `Quasar/design/src/main/scala/dma_ctrl.scala` · block · cursor_line 119 · target_nlines 1 · node_depth 11 · node_path [ "compilation_unit", "class_definition", "template_body", "infix_expression", "call_expression", "field_expression", "field_expression", "call_expression", "arguments", "lambda_expression", "call_expression", "block" ]
  target: `fifo_error_bus := (0 until DMA_BUF_DEPTH).map(i => withClock(dma_free_clk) {RegNext(Mux(fifo_error_bus_en(i), 1.U, fifo_error_bus(i)) & !fifo_reset(i), 0.U)}).reverse.reduce(Cat(_,_))`
- `Quasar/design/src/main/scala/dmi/dmi_wrapper.scala` · class_definition · cursor_line 39 · target_nlines 3 · node_depth 1 · node_path [ "compilation_unit", "class_definition" ]
  target: `val reg_wr_addr = Output(UInt(7.W)) val reg_en = Output(UInt(1.W)) val reg_wr_en = Output(UInt(1.W))`
- `Quasar/design/src/main/scala/exu/exu.scala` · class_definition · cursor_line 21 · target_nlines 2 · node_depth 1 · node_path [ "compilation_unit", "class_definition" ]
  target: `val exu_div_wren = Output(UInt(1.W)) // Divide write enable to GPR //debug`
- `Quasar/design/src/main/scala/exu/exu_div_ctl.scala` · if_expression · cursor_line 138 · target_nlines 1 · node_depth 11 · node_path [ "compilation_unit", "class_definition", "template_body", "function_definition", "block", "val_definition", "call_expression", "field_expression", "call_expression", "arguments", "lambda_expression", "if_expression" ]
  target: `val pat_b = (0 until y.size).map(i=> if(y(i)>=0) m_ff(y(i)) else !m_ff(y(i).abs)).reduce(_&_)`
- `Quasar/design/src/main/scala/exu/exu_mul_ctl.scala` · if_expression · cursor_line 224 · target_nlines 1 · node_depth 12 · node_path [ "compilation_unit", "class_definition", "template_body", "val_definition", "call_expression", "arguments", "call_expression", "field_expression", "field_expression", "call_expression", "arguments", "lambda_expression", "if_expression" ]
  target: `val shfl2_d = Mux(io.rs2_in(1), Range(0, 15,2).map(i=> if(i<4)Cat(shfl4_d(i+1+4,i+4),shfl4_d(i+1,i))else if(i<8)Cat(shfl4_d(i+9,i+8),shfl4_d(i+5,i+4))else if(i<12)Cat(shfl4_d(i+13,i+12),shfl4_d(i+9,i+8))else Cat(shfl4_d(i+17,i+16),shfl4_d(i+13,i+12))).reverse.reduce(Cat(_,_)), shfl4_d)`
- `Quasar/design/src/main/scala/lib/axi4_to_ahb.scala` · block · cursor_line 26 · target_nlines 1 · node_depth 5 · node_path [ "compilation_unit", "class_definition", "template_body", "infix_expression", "call_expression", "block" ]
  target: `dec_tlu_force_halt_bus_q := withClock(io.free_clk) {RegNext(dec_tlu_force_halt_bus_ns, 0.U)}`
- `Quasar/design/src/main/scala/lsu/lsu_stbuf.scala` · block · cursor_line 195 · target_nlines 1 · node_depth 4 · node_path [ "compilation_unit", "class_definition", "template_body", "call_expression", "block" ]
  target: `withClock(io.lsu_stbuf_c1_clk){ RdPtr := RegEnable(NxtRdPtr, 0.U, RdPtrEn)}`
- `NagiCore/src/main/scala/nagicore/Main.scala` · block · cursor_line 48 · target_nlines 3 · node_depth 6 · node_path [ "compilation_unit", "object_definition", "template_body", "match_expression", "case_block", "case_clause", "block" ]
  target: `case _ => { exportVerilog(() => new nagicore.loongarch.nscscc2024.Core) }`
- `NagiCore/src/main/scala/nagicore/loongarch/nscscc2024/Config.scala` · function_definition · cursor_line 14 · target_nlines 1 · node_depth 3 · node_path [ "compilation_unit", "trait_definition", "template_body", "function_definition" ]
  target: `def ICACHE_LINES = 128`
- `NagiCore/src/main/scala/nagicore/loongarch/nscscc2024/stages/IF.scala` · if_expression · cursor_line 58 · target_nlines 5 · node_depth 3 · node_path [ "compilation_unit", "class_definition", "template_body", "if_expression" ]
  target: `if(GlobalConfg.SIM){ import nagicore.unit.DPIC_PERF_PIPE val perf_pipe_if = Module(new DPIC_PERF_PIPE()) perf_pipe_if.io.clk := clock perf_pipe_if.io.rst := reset`
- `NagiCore/src/main/scala/nagicore/loongarch/nscscc2024/stages/MEM.scala` · if_expression · cursor_line 87 · target_nlines 2 · node_depth 4 · node_path [ "compilation_unit", "class_definition", "template_body", "val_definition", "if_expression" ]
  target: `val wordData = if(XLEN == 64) Mux(addr(2), rdata_raw(63, 32), rdata_raw(31, 0)) else rdata_raw(31, 0)`
- `NagiCore/src/main/scala/nagicore/loongarch/nscscc2024Dual/Config.scala` · function_definition · cursor_line 14 · target_nlines 1 · node_depth 3 · node_path [ "compilation_unit", "trait_definition", "template_body", "function_definition" ]
  target: `def ICACHE_LINES = 128`
- `NagiCore/src/main/scala/nagicore/loongarch/nscscc2024Dual/stages/IS.scala` · class_definition · cursor_line 70 · target_nlines 5 · node_depth 1 · node_path [ "compilation_unit", "class_definition" ]
  target: `val is2 = issue_buffer.io.rdatas(1) val data_hazard = (is1.rc === is2.ra || is1.rc === is2.rb) && is1.rc =/= 0.U // 只双发is2是ALU类,且无数据冒险的指令 val issue_double = Flags.is(is2.instr_type, CtrlFlags.InstrType.alu) &&`
- `NagiCore/src/main/scala/nagicore/unit/DIVU.scala` · match_expression · cursor_line 24 · target_nlines 3 · node_depth 3 · node_path [ "compilation_unit", "class_definition", "template_body", "match_expression" ]
  target: `case DIVU_IMP.radix2 => { /* ref: https://github.com/MaZirui2001/LAdataBitsR-pipeline-scala */`
- `NagiCore/src/main/scala/nagicore/unit/GPR.scala` · class_definition · cursor_line 9 · target_nlines 2 · node_depth 1 · node_path [ "compilation_unit", "class_definition" ]
  target: `val wen = Input(Vec(wchannel, Bool())) val waddr = Input(Vec(wchannel, UInt(addrBits.W)))`
- `NagiCore/src/main/scala/nagicore/unit/cache/CachePiped.scala` · block · cursor_line 165 · target_nlines 5 · node_depth 7 · node_path [ "compilation_unit", "class_definition", "template_body", "val_definition", "call_expression", "arguments", "call_expression", "block" ]
  target: `val rdatas = RegEnable(VecInit.tabulate(ways){ i => VecInit.tabulate(num_word){ j => data_bank_io(i)(j).dout } }, pipego_reg)`
- `NagiCore/src/main/scala/nagicore/unit/cache/CacheWT.scala` · if_expression · cursor_line 199 · target_nlines 1 · node_depth 4 · node_path [ "compilation_unit", "class_definition", "template_body", "val_definition", "if_expression" ]
  target: `val addr_word_reg = if(len_word!=0) addr_reg(len_word+len_byte-1, len_byte) else 0.U`
- `constellation/src/main/scala/channel/Nodes.scala` · class_definition · cursor_line 8 · target_nlines 1 · node_depth 1 · node_path [ "compilation_unit", "class_definition" ]
  target: `case class EmptyParams()`
- `constellation/src/main/scala/channel/WidthWidget.scala` · block · cursor_line 96 · target_nlines 1 · node_depth 7 · node_path [ "compilation_unit", "class_definition", "template_body", "val_definition", "instance_expression", "arguments", "assignment_expression", "block" ]
  target: `slaveFn = { s => s.copy(payloadBits=srcBits) }`
- `constellation/src/main/scala/noc/NoC.scala` · block · cursor_line 180 · target_nlines 1 · node_depth 21 · node_path [ "compilation_unit", "class_definition", "template_body", "class_definition", "template_body", "val_definition", "call_expression", "field_expression", "call_expression", "block", "lambda_expression", "indented_block", "call_expression", "field_expression", "parenthesized_expression", "...
  target: `(Seq(s"${r.nodeId} $outs $egresses") ++ ingresses).mkString("\n")`
- `constellation/src/main/scala/protocol/AXI4.scala` · block · cursor_line 74 · target_nlines 1 · node_depth 9 · node_path [ "compilation_unit", "class_definition", "template_body", "val_definition", "call_expression", "arguments", "call_expression", "arguments", "call_expression", "block" ]
  target: `val arFIFOMap = WireInit(VecInit(Seq.fill(endId) { true.B }))`
- `constellation/src/main/scala/protocol/Protocol.scala` · function_definition · cursor_line 43 · target_nlines 1 · node_depth 3 · node_path [ "compilation_unit", "class_definition", "template_body", "function_definition" ]
  target: `def getNodesOut(ls: Seq[String]): Seq[Option[Int]] = getNodes(ls, outNodeMapping)`
- `constellation/src/main/scala/router/OutputUnit.scala` · block · cursor_line 66 · target_nlines 1 · node_depth 9 · node_path [ "compilation_unit", "class_definition", "template_body", "val_definition", "call_expression", "arguments", "call_expression", "arguments", "call_expression", "block" ]
  target: `val states = Reg(MixedVec(cParam.virtualChannelParams.map { u => new OutputState(u.bufferSize) }))`
- `constellation/src/main/scala/router/vcalloc/SingleVCAllocator.scala` · block · cursor_line 37 · target_nlines 1 · node_depth 9 · node_path [ "compilation_unit", "class_definition", "template_body", "val_definition", "call_expression", "arguments", "call_expression", "arguments", "call_expression", "block" ]
  target: `val in_alloc = Wire(MixedVec(allOutParams.map { u => Vec(u.nVirtualChannels, Bool()) }))`
- `constellation/src/main/scala/routing/RoutingRelations.scala` · function_definition · cursor_line 791 · target_nlines 1 · node_depth 7 · node_path [ "compilation_unit", "object_definition", "template_body", "function_definition", "lambda_expression", "instance_expression", "template_body", "function_definition" ]
  target: `override def getNPrios(src: ChannelRoutingInfo): Int = maxVCs`
- `constellation/src/main/scala/soc/Buses.scala` · block · cursor_line 146 · target_nlines 1 · node_depth 15 · node_path [ "compilation_unit", "class_definition", "template_body", "val_definition", "call_expression", "field_expression", "call_expression", "block", "lambda_expression", "indented_block", "parenthesized_expression", "infix_expression", "infix_expression", "parenthesized_expression", "call_expre...
  target: `} .getOrElse { TLAtomicAutomata(arithmetic = pa.arithmetic) })`
- `constellation/src/main/scala/util/Utils.scala` · if_expression · cursor_line 14 · target_nlines 3 · node_depth 5 · node_path [ "compilation_unit", "object_definition", "template_body", "function_definition", "block", "if_expression" ]
  target: `val wrap = (value === (n-1).U) Mux(wrap, 0.U, value + 1.U) }`
- `essent/src/main/scala/Driver.scala` · match_expression · cursor_line 11 · target_nlines 4 · node_depth 5 · node_path [ "compilation_unit", "object_definition", "template_body", "function_definition", "block", "match_expression" ]
  target: `(new ArgsParser).getConfig(args.toSeq) match { case Some(config) => generate(config) case None => }`
- `essent/src/main/scala/Emitter.scala` · block · cursor_line 157 · target_nlines 1 · node_depth 11 · node_path [ "compilation_unit", "function_definition", "match_expression", "case_block", "case_clause", "match_expression", "case_block", "case_clause", "interpolated_string_expression", "interpolated_string", "interpolation", "block" ]
  target: `case Addw => s"${emitExprWrap(p.args(0))}.addw(${emitExprWrap(p.args(1))})"`
- `essent/src/main/scala/IR.scala` · block · cursor_line 28 · target_nlines 1 · node_depth 7 · node_path [ "compilation_unit", "class_definition", "template_body", "function_definition", "interpolated_string_expression", "interpolated_string", "interpolation", "block" ]
  target: `def serialize: String = s"if (${wrEn.serialize} && ${wrMask.serialize}) $memName[${wrAddr.serialize}] = ${wrData.serialize}"`
- `essent/src/main/scala/MFFC.scala` · function_definition · cursor_line 55 · target_nlines 4 · node_depth 3 · node_path [ "compilation_unit", "object_definition", "template_body", "function_definition" ]
  target: `def apply(g: Graph, excludeSet: Set[NodeID] = Set()): ArrayBuffer[NodeID] = { val worker = new MFFC(g) excludeSet foreach { id => worker.mffc(id) = Excluded } val mffc = worker.findMFFCs()`
- `essent/src/main/scala/StatementGraph.scala` · function_definition · cursor_line 85 · target_nlines 1 · node_depth 3 · node_path [ "compilation_unit", "class_definition", "template_body", "function_definition" ]
  target: `def stmtsOrdered(): Seq[Statement] = collectValidStmts(TopologicalSort(this).toSeq)`
- `essent/src/main/scala/disabled/disabledInferAddw.scala` · block · cursor_line 55 · target_nlines 5 · node_depth 11 · node_path [ "compilation_unit", "object_definition", "template_body", "function_definition", "block", "match_expression", "case_block", "case_clause", "match_expression", "case_block", "case_clause", "block" ]
  target: `val eName = primE.args.head match { case w: WRef => w.name } if ((addSigs.contains(eName)) && (primE.consts.head == 1) && (bitWidth(primE.tpe) == 64)) Seq((tName, eName)) else Seq() }`
- `essent/src/main/scala/disabled/disabledRandInitInvalids.scala` · block · cursor_line 38 · target_nlines 1 · node_depth 9 · node_path [ "compilation_unit", "object_definition", "template_body", "function_definition", "block", "val_definition", "field_expression", "parenthesized_expression", "infix_expression", "block" ]
  target: `val portNames = (m.ports map { _.name }).toSet`
- `essent/src/main/scala/disabled/disabledZeroFromBits.scala` · function_definition · cursor_line 36 · target_nlines 5 · node_depth 3 · node_path [ "compilation_unit", "object_definition", "template_body", "function_definition" ]
  target: `def run(c: Circuit): Circuit = { val modulesx = c.modules.map { case m: ExtModule => m case m: Module => simpBitsModule(m) }`
- `essent/src/main/scala/passes/RegFromMem1.scala` · block · cursor_line 69 · target_nlines 1 · node_depth 7 · node_path [ "compilation_unit", "object_definition", "template_body", "function_definition", "block", "val_definition", "infix_expression", "block" ]
  target: `val memsWithWrites = memsToReplace filter { _.writers.nonEmpty }`
- `essent/src/test/scala/ReplaceRsvdKeyTest.scala` · class_definition · cursor_line 17 · target_nlines 4 · node_depth 1 · node_path [ "compilation_unit", "class_definition" ]
  target: `val resultState = firrtlCompiler.execute(CircuitState(circuit, Seq())) val CorrectReader = Source.fromURL(getClass.getResource("/ReplacedRsvdKey_correct.fir")) val correctString = CorrectReader.getLines().mkString("\n") assert(correctString == resultState.circuit.serialize)`
- `essent/src/test/scala/StatementGraphTest.scala` · block · cursor_line 28 · target_nlines 1 · node_depth 6 · node_path [ "compilation_unit", "class_definition", "template_body", "infix_expression", "block", "call_expression", "block" ]
  target: `assertResult(2) { sg.numNodeRefs() }`
- `tree-core-cpu/rtl/TreeCoreL2/tc_l2/src/main/scala/common/InstConfig.scala` · identifier · cursor_line 47 · target_nlines 1 · node_depth 4 · node_path [ "compilation_unit", "trait_definition", "template_body", "val_definition", "identifier" ]
  target: `val uInstType = 5.U(InstTypeLen.W)`
- `tree-core-cpu/rtl/TreeCoreL2/tc_l2/src/main/scala/core/ex/ALU.scala` · class_definition · cursor_line 34 · target_nlines 4 · node_depth 1 · node_path [ "compilation_unit", "class_definition" ]
  target: `instSLTU -> Mux(io.src1.asUInt < io.src2.asUInt, 1.U(XLen.W), 0.U(XLen.W)), instSLTIU -> Mux(io.src1.asUInt < io.imm.asUInt, 1.U(XLen.W), 0.U(XLen.W)), instSLL -> (io.src1 << io.src2(5, 0))(63, 0), instSRL -> (io.src1 >> io.src2(5, 0)),`
- `tree-core-cpu/rtl/TreeCoreL2/tc_l2/src/main/scala/core/ex/EXU.scala` · class_definition · cursor_line 26 · target_nlines 3 · node_depth 1 · node_path [ "compilation_unit", "class_definition" ]
  target: `protected val imm = exReg.imm protected val wen = exReg.wen protected val rs1 = exReg.inst(19, 15)`
- `tree-core-cpu/rtl/TreeCoreL2/tc_l2/src/main/scala/core/ex/Multiplier.scala` · block · cursor_line 23 · target_nlines 5 · node_depth 6 · node_path [ "compilation_unit", "class_definition", "template_body", "function_definition", "block", "if_expression", "block" ]
  target: `} else { val done = RegNext(Mux(io.flush, false.B, en), false.B) val bits = RegEnable(data, en) generatePipe(done, bits, latency - 1) }`
- `tree-core-cpu/rtl/TreeCoreL2/tc_l2/src/main/scala/core/if/Cache.scala` · class_definition · cursor_line 6 · target_nlines 5 · node_depth 1 · node_path [ "compilation_unit", "class_definition" ]
  target: `class CacheReqIO extends Bundle with InstConfig { val addr = UInt(XLen.W) val data = UInt(XLen.W) // for write val mask = UInt((XLen / 8).W) // for write }`
- `tree-core-cpu/rtl/TreeCoreL2/tc_l2/src/main/scala/port/AXI4IO.scala` · class_definition · cursor_line 29 · target_nlines 1 · node_depth 1 · node_path [ "compilation_unit", "class_definition" ]
  target: `class AXI4WIO extends SOCAXI4WIO {}`
- `tree-core-cpu/rtl/TreeCoreL2/tc_l2/src/main/scala/port/BRANCHIO.scala` · class_definition · cursor_line 9 · target_nlines 2 · node_depth 1 · node_path [ "compilation_unit", "class_definition" ]
  target: `val taken = Output(Bool()) // is prev branch taken val idx = Output(UInt(GHRLen.W)) // prev idx of PHT(GHRLen)`
- `tree-core-cpu/rtl/TreeCoreL2/tc_l2/src/main/scala/port/COREIO.scala` · class_definition · cursor_line 7 · target_nlines 5 · node_depth 1 · node_path [ "compilation_unit", "class_definition" ]
  target: `val globalEn = Output(Bool()) val fetch = Flipped(new IFIO) val ld = Flipped(new LDIO) val sd = Flipped(new SDIO) }`
- `tree-core-cpu/rtl/TreeCoreL2/tc_l2/src/main/scala/port/IFIO.scala` · class_definition · cursor_line 6 · target_nlines 5 · node_depth 1 · node_path [ "compilation_unit", "class_definition" ]
  target: `class IFIO extends Bundle with IOConfig { val en = Output(Bool()) val addr = Output(UInt(XLen.W)) val data = Input(UInt(InstLen.W)) }`
End of preview.
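One structural regularity worth noting: in every preview row shown above, `node_depth` appears to equal `len(node_path) - 1`, i.e. the depth of the target node below the `compilation_unit` root. This is an observation from the preview only, not a documented invariant; a quick sanity check over a few rows copied from the preview:

```python
# Rows copied from the preview above: (node_path, node_depth).
rows = [
    (["compilation_unit", "class_definition"], 1),
    (["compilation_unit", "object_definition"], 1),
    (["compilation_unit", "class_definition", "template_body",
      "function_definition"], 3),
    (["compilation_unit", "class_definition", "template_body",
      "function_definition", "indented_block", "field_expression",
      "interpolated_string_expression", "interpolated_string",
      "interpolation", "block"], 9),
]

def depth_matches(node_path, node_depth):
    """Check the observed relation: depth counts edges below the root."""
    return node_depth == len(node_path) - 1

print(all(depth_matches(p, d) for p, d in rows))
```

Rows whose `node_path` is truncated in the preview (trailing `...`) cannot be checked this way.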

HDL-RepoBench Sample

This is a compact, deterministic sample of MHRC-Bench for lightweight inspection and upload tests. It preserves the original language directories and JSONL file names while keeping the output below 2 GB.
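For lightweight inspection, each record's context fields (`left_context`, `groundtruth`, `right_context`, all visible in the schema of the cast error above) can be reassembled around the cursor. The sketch below uses illustrative values, not an actual record, and assumes the usual fill-in-the-middle convention that the three fields concatenate back into the original source:

```python
# A hypothetical record with illustrative values; real records come from the
# JSONL files and carry additional fields (task_id, crossfile_context, ...).
record = {
    "left_context": "class LeChiffreConfig extends Config(",
    "groundtruth": "new WithLeChiffre ++ new DefaultConfig",
    "right_context": ")",
}

def fill_in_the_middle(rec):
    """Reassemble the source around the cursor: left + groundtruth + right."""
    return rec["left_context"] + rec["groundtruth"] + rec["right_context"]

print(fill_in_the_middle(record))
```

A completion model would be given `left_context` (and optionally `crossfile_context` and `right_context`) and scored against `groundtruth`.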

Sampling policy:

  • Languages: chisel, hls, systemverilog, vhdl.
  • Validation and test splits: copied in full by default.
  • Training split: repository-stratified deterministic sample.
  • Raw repository archive: omitted because original_repo/collected_repos.tar.gz is about 33 GB in the source dataset.
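The exact sampling arguments are recorded in sample_manifest.json; as a rough illustration of what "repository-stratified deterministic" can mean, one common approach is to group records by repository and rank within each group by a stable hash, so reruns reproduce the same subset. This is a sketch of the idea, not the actual sampler:

```python
import hashlib

def stratified_sample(records, per_repo=2):
    """Keep at most `per_repo` records from each repository, chosen by a
    stable hash order over task_id, so the selection is deterministic."""
    by_repo = {}
    for rec in records:
        by_repo.setdefault(rec["repo"], []).append(rec)
    kept = []
    for repo in sorted(by_repo):  # stable iteration order across runs
        ranked = sorted(
            by_repo[repo],
            key=lambda r: hashlib.sha256(r["task_id"].encode()).hexdigest(),
        )
        kept.extend(ranked[:per_repo])
    return kept

# Toy input: three repositories with five records each.
records = [{"repo": repo, "task_id": f"{repo}-{i}"}
           for repo in ("chiffre", "diagrammer", "dana") for i in range(5)]
sampled = stratified_sample(records)
assert len(sampled) == 6                      # two per repository
assert sampled == stratified_sample(records)  # rerun gives identical output
```

Because the selection depends only on the record contents (not on RNG state or iteration order), regenerating the sample from the same source yields the same files byte-for-byte.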

This generated sample contains 6,537 JSONL records and occupies 217.0 MiB on disk. See sample_manifest.json for exact per-file counts, byte sizes, and generation arguments.
