H-matrix subblocks


simon.d...@gmail.com
Jan 22, 2020, 4:59:07 AM
to bempp
Dear Bempp team,

I am quite new to the Bempp package (3.4.3) and I can't seem to figure out the following:
How do I find the subblock in the matrix form of a discrete boundary operator that corresponds to an admissible leaf node in its block cluster tree? Specifically, I am running the following code:


import bempp.api
import numpy as np

import matplotlib.pyplot as plt
import numpy.linalg as LA

# Initialise Helmholtz problem
k = 1
grid = bempp.api.shapes.regular_sphere(4)
space = bempp.api.function_space(grid, 'DP', 0)
bempp.api.global_parameters.hmat.eps = 1e-7

# Form hmat and dense SLP operators, their matrices, and the block cluster tree
bempp.api.global_parameters.assembly.boundary_operator_assembly_type = 'hmat'
discrete_operator = bempp.api.operators.boundary.helmholtz.single_layer(
    space, space, space, k, use_projection_spaces=False).weak_form()
elements = grid.leaf_view.elements
bempp.api.global_parameters.assembly.boundary_operator_assembly_type = 'dense'
discrete_operator_dense = bempp.api.operators.boundary.helmholtz.single_layer(
    space, space, space, k).weak_form()
tree = bempp.api.hmatrix_interface.block_cluster_tree(discrete_operator)
Hmat = bempp.api.as_matrix(discrete_operator)
Mat = bempp.api.as_matrix(discrete_operator_dense)

# Find the largest admissible block
admissible_nodes = [node for node in tree.leaf_nodes if node.admissible]
ranks = [bempp.api.hmatrix_interface.data_block(discrete_operator, node).rank
         for node in admissible_nodes]
mx = 0
row_range = []
column_range = []
largest_block = admissible_nodes[0]
for node in admissible_nodes:
    mx_test = max(mx,
                  node.row_cluster_range[1] - node.row_cluster_range[0],
                  node.column_cluster_range[1] - node.column_cluster_range[0])
    if mx_test > mx:
        mx = mx_test
        row_range = node.row_cluster_range
        column_range = node.column_cluster_range
        largest_block = node

# Return the low-rank representation of the block
A = bempp.api.hmatrix_interface.data_block(discrete_operator, largest_block).A
B = bempp.api.hmatrix_interface.data_block(discrete_operator, largest_block).B
M = np.matmul(A, B)

# The following is where I struggle:
M1 = Hmat[row_range[0]:row_range[1], column_range[0]:column_range[1]]
M2 = Mat[row_range[0]:row_range[1], column_range[0]:column_range[1]]

# Appropriately small -> Hmat is a good approximation of the dense matrix on the
# subblock [row_range[0]:row_range[1], column_range[0]:column_range[1]]
print(np.linalg.norm(M1 - M2) / np.linalg.norm(M1))

# Not small: A*B does not correspond to the selected subblock of Hmat
print(np.linalg.norm(M1 - np.matmul(A, B)) / np.linalg.norm(M1))

The last, large error suggests that I am missing something about the difference between row_cluster_range and column_cluster_range on the one hand, and the matrix indices on the other hand, but after digging around in the code I still can't figure out what.
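To make the indexing question concrete, here is a self-contained NumPy sketch of what I suspect is happening: the cluster ranges index a permuted ordering of the DOFs rather than the original matrix ordering. The permutation p below is entirely hypothetical; I don't know whether or how bempp exposes it, this only illustrates the indexing relation I have in mind:

import numpy as np

rng = np.random.default_rng(0)
n = 8
M = rng.standard_normal((n, n))   # stands in for the dense matrix Mat
p = rng.permutation(n)            # hypothetical cluster-tree DOF permutation
M_perm = M[np.ix_(p, p)]          # the matrix in the permuted ordering

# A leaf block given by cluster ranges in the *permuted* ordering:
r0, r1, c0, c1 = 1, 4, 2, 6
block = M_perm[r0:r1, c0:c1]      # what A @ B would approximate

# The same entries, recovered from the unpermuted matrix by mapping
# the cluster ranges through the permutation:
same_block = M[np.ix_(p[r0:r1], p[c0:c1])]
assert np.allclose(block, same_block)

If this picture is right, the correct comparison would be np.linalg.norm(Mat[np.ix_(p[row_range[0]:row_range[1]], p[column_range[0]:column_range[1]])] - A @ B), with p taken from the row and column cluster trees, rather than slicing Hmat directly with the cluster ranges.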
I apologise if this is a stupid question, but I don't have documentation available to me, and I can't seem to find the answer in the code.

Kind regards,
Simon Dirckx

