• Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 286

Using vectors and references - Cannot convert from vector to &


void setBit(BIT &aBit, const bool aValue){
    // ...
}

int main(){
      vector<BIT> bitVec[3];
      setBit(bitVec[0], false);
      setBit(bitVec[1], false);
      setBit(bitVec[2], false);
      return 0;
}

I get errors saying it cannot convert from 'std::vector<BIT>' to 'BIT &'. What am I doing wrong?
1 Solution
Change vector<BIT> bitVec[3]; to vector<BIT> bitVec;. The way it is currently, you are declaring an array of three vector<BIT> objects, so bitVec[0] accesses a whole vector, not a BIT. You will then need to use bitVec.push_back(someBIT), where someBIT is of type BIT, in order to add elements to the vector.
Unimatrix_001Author Commented:
Thanks. I knew it would be something silly... Time to sleep I think. :)
Actually, you'd better use :

        vector<BIT> bitVec(3);

(this constructs the vector with 3 default-initialized BITs), because immediately afterwards you access the vector with bitVec[0], bitVec[1] and bitVec[2], which would be out of bounds on an empty vector.
