I am using Modulus to solve a 3D PDE. Just a plain PINN, not DeepONet.

The boundary condition of my PDE is not just a constant value, and it is not easy to write as a SymPy function.

So I have generated a dataset of 10000 boundary coordinate points (x, y, z) together with the corresponding true values of the solution (u).

I looked at the Constraints documentation, and it seems that I should use SupervisedGridConstraint to add those boundary points as a boundary condition.

So I wrote the code as:

```
from modulus.domain.constraint.discrete import DictGridDataset, SupervisedGridConstraint

classical_method_boundar_r_1 = np.loadtxt('../classical_method_results/grid_r_is_1.txt', unpack=True)
x_bc, y_bc, z_bc, u_bc = classical_method_boundar_r_1

bc = PointwiseBoundaryConstraint(
    nodes=nodes,
    geometry=geo,
    outvar={"u": 0},
    batch_size=params['batch_size_bc'],
    lambda_weighting={"u": 1},
    quasirandom=True,
    fixed_dataset=False,
)
domain.add_constraint(bc, "bc")
```

where grid_r_is_1.txt contains:

```
-0.140437 -0.0724004 0.987439 0.0087098
-0.135563 -0.0811617 0.987439 0.00870911
-0.130142 -0.0895963 0.987439 0.00870838
-0.124197 -0.09767 0.987439 0.00870761
...
```
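In case it is relevant, this is how I would package the loaded columns into dictionaries of (N, 1) arrays, which is my understanding of what the Modulus discrete datasets expect (the key names `x`, `y`, `z`, `u` are my assumption; dummy data stands in for the file here):

```python
import numpy as np

# Dummy stand-in for np.loadtxt('.../grid_r_is_1.txt', unpack=True):
# unpack=True transposes, so we get four rows, one per column of the file.
data = np.random.rand(4, 10000)
x_bc, y_bc, z_bc, u_bc = data

# Each variable as a column vector of shape (N, 1); the key names
# are my assumption about what the network's input/output keys are.
invar = {"x": x_bc.reshape(-1, 1),
         "y": y_bc.reshape(-1, 1),
         "z": z_bc.reshape(-1, 1)}
outvar = {"u": u_bc.reshape(-1, 1)}

print(invar["x"].shape, outvar["u"].shape)  # (10000, 1) (10000, 1)
```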

Unfortunately, it crashes with:

```
Cell In [20], line 15
12 slv = Solver(cfg, domain)
14 # start solver
---> 15 slv.solve(callback=print)
File /modulus/modulus/solver/solver.py:159, in Solver.solve(self, sigterm_handler, callback, if_save)
157 def solve(self, sigterm_handler=None, callback=None, if_save=True):
158 if self.cfg.run_mode == "train":
--> 159 self._train_loop(sigterm_handler, callback=callback, if_save=if_save)
160 elif self.cfg.run_mode == "eval":
161 self._eval()
File /modulus/modulus/trainer.py:524, in Trainer._train_loop(self, sigterm_handler, callback, if_save)
520 if self.cfg.cuda_graphs:
521 # If cuda graphs statically load it into defined allocations
522 self.load_data(static=True)
--> 524 loss, losses = self._cuda_graph_training_step(step)
525 else:
526 # Load all data for constraints
527 self.load_data()
File /modulus/modulus/trainer.py:718, in Trainer._cuda_graph_training_step(self, step)
715 self.global_optimizer_model.zero_grad(set_to_none=True)
...
27 norm = self.weight.norm(dim=1, p=2, keepdim=True)
28 weight = self.weight_g * self.weight / norm
---> 29 return F.linear(input, weight, self.bias)
RuntimeError: size mismatch, got 10, 10x3,300
```
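If I read the final line correctly, the failure is a shape mismatch inside `F.linear`. A minimal NumPy analogue of what I think is happening (the sizes 10x3 and 300 are taken from the error message; everything else is my guess):

```python
import numpy as np

# 10 points with 3 input features (x, y, z), like a batch of my boundary data
inputs = np.zeros((10, 3))

# A layer whose weight is shaped (out_features, in_features) with
# in_features=300, the number that appears in the error message
weight = np.zeros((512, 300))

try:
    # F.linear computes inputs @ weight.T, which needs in_features
    # to match the last dimension of inputs (300 vs 3 here)
    out = inputs @ weight.T
except ValueError as e:
    print("shape mismatch:", e)
```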

Beyond that, I have no idea how to debug it.

Could anyone give me some suggestions?

Thanks in advance!