```python
def chunk_operations_by_size(operations):
    current_chunk = []
    current_size = 0
    chunks = []
    for operation in operations:
        operation_size = operation.ByteSize()  # AttributeError: Unknown field for MutateOperation: ByteSize
        operation_size = protobuf_helpers.get_bytes_size(operation)  # Cannot find reference 'get_bytes_size' in 'protobuf_helpers.py'
        operation_size = len(operation.SerializeToString())  # AttributeError: Unknown field for MutateOperation: SerializeToString
        if operation_size > SAFE_BYTE_THRESHOLD:
            raise ValueError(
                f"Operation size {operation_size} exceeds maximum size {SAFE_BYTE_THRESHOLD}."
            )
        if current_size + operation_size > SAFE_BYTE_THRESHOLD:
            chunks.append(current_chunk)
            current_chunk = []
            current_size = 0
        current_chunk.append(operation)
        current_size += operation_size
    if current_chunk:
        chunks.append(current_chunk)
    return chunks
```
This code is meant to measure each operation's byte size and cap the total size of each chunk.
But I can't get the exact byte size of an operation, because none of these three approaches works:
```
operation_size = operation.ByteSize() # AttributeError: Unknown field for MutateOperation: ByteSize
operation_size = protobuf_helpers.get_bytes_size(operation) # Cannot find reference 'get_bytes_size' in 'protobuf_helpers.py'
operation_size = len(operation.SerializeToString()) # AttributeError: Unknown field for MutateOperation: SerializeToString
```
How can I get the exact byte size of each operation?
PS: Please don't ask me for logs. I understand the problem I'm hitting very well: a single `AddBatchJobOperationsRequest` exceeds the byte-size limit, so I have to split the operations across requests, but I can't measure the size of each operation. Please give me a direct way to get an operation's size. Thank you!
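For what it's worth, the chunking loop itself behaves as intended once a working size function is plugged in. Here is a minimal, self-contained sketch of that loop, where plain `bytes` payloads stand in for the real operations, `len()` stands in for whatever sizing call ends up working, and the `SAFE_BYTE_THRESHOLD` value is made up for the demo:

```python
SAFE_BYTE_THRESHOLD = 10  # illustrative value only, not the real API limit


def chunk_operations_by_size(operations, size_of=len):
    """Greedily pack operations into chunks whose total size stays under the threshold."""
    chunks = []
    current_chunk = []
    current_size = 0
    for operation in operations:
        operation_size = size_of(operation)
        if operation_size > SAFE_BYTE_THRESHOLD:
            # A single oversized operation can never fit into any chunk.
            raise ValueError(
                f"Operation size {operation_size} exceeds maximum size {SAFE_BYTE_THRESHOLD}."
            )
        if current_size + operation_size > SAFE_BYTE_THRESHOLD:
            # Current chunk is full; start a new one.
            chunks.append(current_chunk)
            current_chunk = []
            current_size = 0
        current_chunk.append(operation)
        current_size += operation_size
    if current_chunk:
        chunks.append(current_chunk)
    return chunks


# bytes payloads stand in for real operations; len() stands in for the sizing call
print(chunk_operations_by_size([b"aaaa", b"bbbb", b"cccc", b"dd"]))
# [[b'aaaa', b'bbbb'], [b'cccc', b'dd']]
```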