Is it just me, or has nobody ever tried to run native compute on universe-polymorphic data? I am seeing things in the implementation that are outright wrong and just can't work.
For instance, we have this call:
let ml_of_level l =
  match Univ.Level.var_index l with
  | Some i ->
    (* FIXME: use a proper cast function *)
    let univ = MLprimitive (MLmagic, [|MLlocal (Option.get instance)|]) in
    MLprimitive (MLarrayget, [|univ; MLint i|])
  | None -> let i = push_symbol (SymbLevel l) in get_level_code i
in
But MLarrayget expects a persistent array as its argument, not an OCaml array. How can this even work?
There are also ominous FIXMEs here and there, like this one:
let accu =
  let args =
    if Int.equal (Univ.AbstractContext.size u) 0 then
      [|get_ind_code j; MLarray [||]|]
    else [|get_ind_code j|]
  in
  (* FIXME: pass universes here *)
  Glet(name, MLprimitive (Mk_ind, args))
in
IIUC, we just don't pass any universe instance to the inductive there when it's polymorphic.
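Spelling out the branch in that snippet (a simplified standalone sketch with hypothetical names, not the actual Coq code): when the abstract universe context is empty, an explicit empty instance is appended, but in the polymorphic case (size > 0) no instance argument is built at all, which is exactly what the FIXME hints at.

```
(* Illustrative sketch only: ind_code stands in for get_ind_code j,
   and the strings stand in for the ML AST nodes in the real code. *)
let mk_ind_args ~ctx_size ind_code =
  if ctx_size = 0 then
    [| ind_code; "MLarray [||]" |]  (* monomorphic: empty instance passed *)
  else
    [| ind_code |]                  (* polymorphic: no instance at all *)
```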
No, MLarrayget is plain Array.get, so it takes an OCaml array.
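If that is right, the MLmagic/MLarrayget pair in the first snippet compiles to something like the following OCaml sketch (hypothetical function name; the real compiler emits MLprimitive nodes rather than source code):

```
(* Roughly what the generated code does for a variable level: cast the
   universe instance with Obj.magic, then index it with plain Array.get. *)
let level_of_var (instance : Obj.t) (i : int) : Obj.t =
  Array.get (Obj.magic instance : Obj.t array) i
```

On this reading the code is type-unsafe but operationally fine, as long as the instance really is an OCaml array at runtime.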
Right, I got confused with the Arrayget primitive constructor. The other snippet still seems very suspicious, though.
Last updated: Oct 13 2024 at 01:02 UTC