
Hello,
I am writing a program that builds a tree (similar to a BST) in which each node consists of an ArrayList of strings (limited to a certain number of items per node) and two pointers to its left and right children.

The strings are made up of 0s and 1s, and the tree routes each string left or right based on one of its characters; however, the charAt index that gets evaluated depends on the level of the node it is being added to.
Example:
Trying to add 1011001001 to a root that is already full will evaluate the character at index 0 (a '1') and send the string to the root's right child.
Say that next level gets filled up as well and we then try to add 1011011101 to it: it will evaluate the character at index 1 (a '0') and send the string to that node's left child.
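
In other words (if I have the rule right), the index that gets checked is just the depth of the full node the string is being added to. Roughly (depth is just a name I'm using for illustration):

// depth = level of the full node the string is being added to (root = 0)
if (s.charAt(depth) == '0'){
    // send the string to that node's left child
} else {
    // send it to that node's right child
}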

Here is what I have so far:
Node class

import java.util.ArrayList;

public class Node {
    static final int size = 5;     // maximum number of strings per node (5 in my test runs)
    ArrayList<String> elements;    // strings stored in this node
    Node leftC, rightC;            // left and right children

    public Node(String s){
        elements = new ArrayList<String>();
        elements.add(s);
    }

    public boolean isEmpty(){
        return elements.size() == 0;
    }

    public boolean isFull(){
        return elements.size() == size;
    }

    public void addString(String s){
        elements.add(s);
        // Collections.sort(elements);
    }
}

Tree class

public class Tree {
    private Node root;

    public void insert(String s){
        root = insert(root, s);
    }

    private Node insert(Node n, String s){
        if (n == null){
            n = new Node(s);       // empty spot: start a new node here
        } else if (!n.isFull()){
            n.addString(s);        // this node still has room
        } else {
            // node is full: route the string by one of its characters
            // (right now this always checks index 0, whatever the depth)
            if (s.charAt(0) == '0'){
                n.leftC = insert(n.leftC, s);
            } else if (s.charAt(0) == '1'){
                n.rightC = insert(n.rightC, s);
            }
        }
        return n;
    }
}

The problem I'm running into is that I can't figure out how to make the charAt check use the index that matches the node's level.

All it produces right now is a triangle-shaped "tree" with every string starting with 1 on the right and every string starting with 0 on the left.

Example:
Root node:
[1011001001, 1011101101, 1111011011, 1011011101, 0101111011]

Left Child Node [0111001000, 0111111111, 0010101010, 0100000000, 0011010000]
Right Child Node [1111011111, 1011010110, 1001101000, 1000001110, 1110000000]

right child @ right [1100000011, 1100000000, 1011100000, 1011011011, 1010101010]
left child @ right - none
right child @ left - none
left child @ left [0010010101, 0010101011, 0110000100, 0011000000, 0010110111]

As you can see, that's incorrect. Doing it by hand would give these results:

right child @ right [1100000011, 1100000000]
left child @ right [1011100000]

right child @ left [0110000100, 0111001000]
left child @ left [0010010101, 0010101011, 0011000000, 0010110111, 0001110000]

I've tried setting a "depth" value by calling a method I made that evaluates the height of the tree at a specific node, but that didn't work out at all.
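
What I'm wondering is whether I should instead just pass a depth value down through the recursive insert, something like this (untested sketch, the extra parameter is my own addition):

public void insert(String s){
    root = insert(root, s, 0);    // the root is at depth 0
}

private Node insert(Node n, String s, int depth){
    if (n == null){
        n = new Node(s);
    } else if (!n.isFull()){
        n.addString(s);
    } else {
        // check the character at the current depth instead of always index 0
        if (s.charAt(depth) == '0'){
            n.leftC = insert(n.leftC, s, depth + 1);
        } else {
            n.rightC = insert(n.rightC, s, depth + 1);
        }
    }
    return n;
}

Is that the right way to go about it, or is there a better approach?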

Any input is GREATLY appreciated.